Oct 02 16:17:19 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 16:17:20 crc restorecon[4747]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 16:17:20 crc restorecon[4747]: 
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:20 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 16:17:21 crc restorecon[4747]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 02 16:17:22 crc kubenswrapper[4882]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.013493 4882 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028122 4882 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028159 4882 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028174 4882 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028188 4882 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028202 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028242 4882 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028256 4882 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028267 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028277 4882 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028286 4882 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028296 4882 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028310 4882 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028323 4882 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028335 4882 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028345 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028356 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028367 4882 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028377 4882 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028387 4882 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028426 4882 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028439 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028449 4882 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028460 4882 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028472 4882 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028482 4882 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028491 4882 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028501 4882 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028512 4882 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028525 4882 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028535 4882 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028545 4882 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028555 4882 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028565 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028576 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028586 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028596 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028606 4882 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028617 4882 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028627 4882 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028637 4882 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028650 4882 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028662 4882 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028672 4882 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028682 4882 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028693 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028703 4882 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028713 4882 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028723 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028733 4882 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028743 4882 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028753 4882 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028763 4882 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028773 4882 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028783 4882 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028793 4882 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028806 4882 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028816 4882 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028826 4882 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028836 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028846 4882 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028855 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028865 4882 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028875 4882 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028885 4882 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028899 4882 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028914 4882 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028925 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028935 4882 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028946 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028964 4882 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.028975 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039735 4882 flags.go:64] FLAG: --address="0.0.0.0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039773 4882 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039801 4882 flags.go:64] FLAG: --anonymous-auth="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039818 4882 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039835 4882 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039848 4882 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039865 4882 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039881 4882 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039893 4882 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039905 4882 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039918 4882 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039931 4882 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039943 4882 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039955 4882 flags.go:64] FLAG: --cgroup-root=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039966 4882 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039979 4882 flags.go:64] FLAG: --client-ca-file=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.039991 4882 flags.go:64] FLAG: --cloud-config=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040002 4882 flags.go:64] FLAG: --cloud-provider=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040014 4882 flags.go:64] FLAG: --cluster-dns="[]"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040028 4882 flags.go:64] FLAG: --cluster-domain=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040042 4882 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040055 4882 flags.go:64] FLAG: --config-dir=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040066 4882 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040079 4882 flags.go:64] FLAG: --container-log-max-files="5"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040094 4882 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040106 4882 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040117 4882 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040129 4882 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040141 4882 flags.go:64] FLAG: --contention-profiling="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040152 4882 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040165 4882 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040178 4882 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040189 4882 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040204 4882 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040255 4882 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040268 4882 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040281 4882 flags.go:64] FLAG: --enable-load-reader="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040293 4882 flags.go:64] FLAG: --enable-server="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040305 4882 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040323 4882 flags.go:64] FLAG: --event-burst="100"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040335 4882 flags.go:64] FLAG: --event-qps="50"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040347 4882 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040359 4882 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040371 4882 flags.go:64] FLAG: --eviction-hard=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040386 4882 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040398 4882 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040410 4882 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040422 4882 flags.go:64] FLAG: --eviction-soft=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040435 4882 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040446 4882 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040459 4882 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040472 4882 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040483 4882 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040495 4882 flags.go:64] FLAG: --fail-swap-on="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040507 4882 flags.go:64] FLAG: --feature-gates=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040522 4882 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040536 4882 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040549 4882 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040561 4882 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040573 4882 flags.go:64] FLAG: --healthz-port="10248"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040589 4882 flags.go:64] FLAG: --help="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040601 4882 flags.go:64] FLAG: --hostname-override=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040613 4882 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040625 4882 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040637 4882 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040649 4882 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040660 4882 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040672 4882 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040683 4882 flags.go:64] FLAG: --image-service-endpoint=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040695 4882 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040707 4882 flags.go:64] FLAG: --kube-api-burst="100"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040718 4882 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040732 4882 flags.go:64] FLAG: --kube-api-qps="50"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040743 4882 flags.go:64] FLAG: --kube-reserved=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040756 4882 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040767 4882 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040779 4882 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040791 4882 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040802 4882 flags.go:64] FLAG: --lock-file=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040814 4882 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040826 4882 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040838 4882 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040856 4882 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040869 4882 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040880 4882 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040892 4882 flags.go:64] FLAG: --logging-format="text"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040903 4882 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040916 4882 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040928 4882 flags.go:64] FLAG: --manifest-url=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040940 4882 flags.go:64] FLAG: --manifest-url-header=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040956 4882 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040969 4882 flags.go:64] FLAG: --max-open-files="1000000"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.040987 4882 flags.go:64] FLAG: --max-pods="110"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041000 4882 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041012 4882 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041024 4882 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041036 4882 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041047 4882 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041060 4882 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041071 4882 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041100 4882 flags.go:64] FLAG: --node-status-max-images="50"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041112 4882 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041124 4882 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041136 4882 flags.go:64] FLAG: --pod-cidr=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041148 4882 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041169 4882 flags.go:64] FLAG: --pod-manifest-path=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041180 4882 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041193 4882 flags.go:64] FLAG: --pods-per-core="0"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041205 4882 flags.go:64] FLAG: --port="10250"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041252 4882 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041274 4882 flags.go:64] FLAG: --provider-id=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041286 4882 flags.go:64] FLAG: --qos-reserved=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041298 4882 flags.go:64] FLAG: --read-only-port="10255"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041311 4882 flags.go:64] FLAG: --register-node="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041323 4882 flags.go:64] FLAG: --register-schedulable="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041335 4882 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041360 4882 flags.go:64] FLAG: --registry-burst="10"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041372 4882 flags.go:64] FLAG: --registry-qps="5"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041384 4882 flags.go:64] FLAG: --reserved-cpus=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041396 4882 flags.go:64] FLAG: --reserved-memory=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041411 4882 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041424 4882 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041437 4882 flags.go:64] FLAG: --rotate-certificates="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041448 4882 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041460 4882 flags.go:64] FLAG: --runonce="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041473 4882 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041486 4882 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041499 4882 flags.go:64] FLAG: --seccomp-default="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041512 4882 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041525 4882 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041537 4882 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041549 4882 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041562 4882 flags.go:64] FLAG: --storage-driver-password="root"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041575 4882 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041589 4882 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041602 4882 flags.go:64] FLAG: --storage-driver-user="root"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041614 4882 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041626 4882 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041639 4882 flags.go:64] FLAG: --system-cgroups=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041650 4882 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041671 4882 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041682 4882 flags.go:64] FLAG: --tls-cert-file=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041701 4882 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041716 4882 flags.go:64] FLAG: --tls-min-version=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041728 4882 flags.go:64] FLAG: --tls-private-key-file=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041740 4882 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041752 4882 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041763 4882 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041776 4882 flags.go:64] FLAG: --v="2"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041793 4882 flags.go:64] FLAG: --version="false"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041810 4882 flags.go:64] FLAG: --vmodule=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041825 4882 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.041837 4882 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045745 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045806 4882 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045815 4882 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045824 4882 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045832 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045841 4882 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045849 4882 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045861 4882 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045873 4882 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045882 4882 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045892 4882 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045900 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045908 4882 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045916 4882 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045927 4882 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045936 4882 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045945 4882 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045953 4882 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045961 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045971 4882 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045979 4882 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045987 4882 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.045995 4882 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046003 4882 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046010 4882 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046018 4882 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046026 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046034 4882 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046042 4882 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046050 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046057 4882 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046066 4882 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046073 4882 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046084 4882 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046092 4882 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046100 4882 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046108 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046116 4882 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046124 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046134 4882 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046143 4882 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046152 4882 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046160 4882 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046168 4882 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046176 4882 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046207 4882 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046248 4882 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046269 4882 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046296 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046307 4882 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046316 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046325 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046334 4882 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046341 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046352 4882 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046378 4882 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046388 4882 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046397 4882 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046406 4882 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046415 4882 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046424 4882 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046432 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046441 4882 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046448 4882 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046456 4882 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046465 4882 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046473 4882 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046481 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046490 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046497 4882 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.046505 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.046517 4882 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.073129 4882 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.073178 4882 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073334 4882 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073350 4882 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073361 4882 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073370 4882 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073378 4882 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073386 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073394 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073402 4882 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073411 4882 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073421 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073431 4882 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073440 4882 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073448 4882 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073456 4882 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073464 4882 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073471 4882 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073479 4882 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073487 4882 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073496 4882 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073504 4882 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073512 4882 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073520 4882 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073527 4882 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073536 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073545 4882 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073553 4882 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073564 4882 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073574 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073583 4882 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073592 4882 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073601 4882 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073609 4882 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073619 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073627 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073638 4882 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073649 4882 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073658 4882 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073666 4882 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073674 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073682 4882 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073690 4882 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073698 4882 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073706 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073714 4882 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073725 4882 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073735 4882 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073745 4882 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073753 4882 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073762 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073772 4882 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073780 4882 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073790 4882 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073802 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073810 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073820 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073828 4882 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073836 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073845 4882 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073854 4882 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073862 4882 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073870 4882 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073878 4882 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073885 4882 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073893 4882 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073902 4882 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073913 4882 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073923 4882 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073931 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073940 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073948 4882 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.073970 4882 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.073983 4882 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074203 4882 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074248 4882 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074262 4882 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074272 4882 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074283 4882 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074293 4882 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074302 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074312 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074321 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074330 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074338 4882 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074346 4882 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074354 4882 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074363 4882 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074370 4882 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074378 4882 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074387 4882 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074395 4882 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074404 4882 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074411 4882 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074419 4882 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074428 4882 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074435 4882 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074444 4882 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074452 4882 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074460 4882 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074467 4882 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074475 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074485 4882 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074495 4882 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074504 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074512 4882 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074522 4882 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074530 4882 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074543 4882 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074552 4882 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074560 4882 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074570 4882 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074581 4882 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074590 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074601 4882 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074611 4882 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074622 4882 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074632 4882 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074641 4882 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074649 4882 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074658 4882 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074665 4882 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074673 4882 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074681 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074690 4882 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074701 4882 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074709 4882 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074717 4882 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074725 4882 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074733 4882 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074741 4882 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074749 4882 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074757 4882 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074765 4882 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074773 4882 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074781 4882 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074789 4882 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074797 4882 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074805 4882 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074814 4882 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074822 4882 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074829 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074837 4882 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074846 4882 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.074855 4882 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.074868 4882 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.075097 4882 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.096479 4882 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.096621 4882 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.139850 4882 server.go:997] "Starting client certificate rotation"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.139908 4882 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.140168 4882 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-28 03:35:18.621398483 +0000 UTC
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.140312 4882 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2075h17m56.481093335s for next certificate rotation
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.466029 4882 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.495377 4882 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.595053 4882 log.go:25] "Validated CRI v1 runtime API"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.642888 4882 log.go:25] "Validated CRI v1 image API"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.645611 4882 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.652626 4882 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-12-13-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.652669 4882 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.671144 4882 manager.go:217] Machine: {Timestamp:2025-10-02 16:17:22.668337668 +0000 UTC m=+1.417567205 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed BootID:06757550-4393-42fc-bde7-149710ea74c8 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7b:69:24 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7b:69:24 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4e:dc:8a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:55:d0:69 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:47:f1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:62:f0:5e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ff:a6:78 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f2:dc:4b:78:52:9b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ea:a7:3c:42:55:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.671436 4882 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.671640 4882 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.671979 4882 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.672170 4882 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.672207 4882 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.672502 4882 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.672515 4882 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.673055 4882 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.673092 4882 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.673800 4882 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.673904 4882 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.677190 4882 kubelet.go:418] "Attempting to sync node with API server"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.677249 4882 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.677273 4882 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.677290 4882 kubelet.go:324] "Adding apiserver pod source"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.677304 4882 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.681838 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.681952 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.681863 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.682049 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.683590 4882 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.684586 4882 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.686722 4882 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688063 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688085 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688148 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688156 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688168 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688176 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688184 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688198 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688207 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688232 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688247 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688255 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688279 4882 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.688753 4882 server.go:1280] "Started kubelet"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.689142 4882 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.689318 4882 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.689587 4882 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.690788 4882 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 02 16:17:22 crc systemd[1]: Started Kubernetes Kubelet.
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.690907 4882 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.690946 4882 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.690979 4882 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:59:52.044587017 +0000 UTC
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.691048 4882 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1880h42m29.353550942s for next certificate rotation
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.691270 4882 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.691358 4882 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.691603 4882 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.691756 4882 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.692633 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.696114 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.696343 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.698454 4882 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.698572 4882 factory.go:55] Registering systemd factory
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.698581 4882 factory.go:221] Registration of the systemd container factory successfully
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.700331 4882 factory.go:153] Registering CRI-O factory
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.700399 4882 factory.go:221] Registration of the crio container factory successfully
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.700465 4882 factory.go:103] Registering Raw factory
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.700639 4882 manager.go:1196] Started watching for new ooms in manager
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.702721 4882 manager.go:319] Starting recovery of all containers
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.704727 4882 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ab8d817862e6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 16:17:22.688716394 +0000 UTC m=+1.437945921,LastTimestamp:2025-10-02 16:17:22.688716394 +0000 UTC m=+1.437945921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.709685 4882 server.go:460] "Adding debug handlers to kubelet server"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.714830 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715151 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715167 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715180 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715244 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715258 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715269 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715280 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715296 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715308 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715319 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715330 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715340 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715353 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715387 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715398 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715411 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715421 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715431 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.715443 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717807 4882 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717886 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717911 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717928 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717945 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717963 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.717980 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718009 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718041 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718063 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718083 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718102 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718116 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718130 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718178 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718194 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718230 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718246 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718260 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718325 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718340 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718352 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718367 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718383 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718396 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718412 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718428 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718443 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718493 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718506 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718520 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718533 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718547 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718568 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718583 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718600 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718614 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718627 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718640 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718654 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718668 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718682 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718696 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718710 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718730 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718748 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718761 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718775 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718788 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718809 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718822 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718834 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718849 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718862 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718875 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718907 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718919 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718931 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718944 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718957 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718971 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718981 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.718992 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719005 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719033 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719046 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719058 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719070 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719081 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719092 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719102 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719115 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719126 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719143 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719156 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719167 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719178 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719190 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719202 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719227 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719239 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719251 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719263 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719274 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719293 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719313 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719325 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719340 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719352 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719365 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719376 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719388 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719403 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719415 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719427 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719443 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719454 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719469 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719481 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719492 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719504 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719517 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719529 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719577 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719590 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719605 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719621 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719632 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719644 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719655 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719667 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719679 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719690 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719703 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719714 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719727 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719738 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719750 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719760 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719770 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719783 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719794 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719808 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719819 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719831 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719841 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719853 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719886 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719897 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719910 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719926 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719936 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719948 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719959 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719971 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.719985 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720001 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720016 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720030 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720043 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720054 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720065 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720083 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720095 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720106 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720119 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720131 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720145 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720156 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720169 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720180 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720192 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720204 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720231 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720243 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720256 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720267 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720279 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720297 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720314 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720325 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720338 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720368 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720379 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720390 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720400 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720411 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720421 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720433 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720445 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720456 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720468 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720480 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720490 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720503 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720515 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720525 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720536 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720552 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720562 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720572 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720583 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720594 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720604 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720614 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720623 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720635 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720646 4882 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720655 4882 reconstruct.go:97] "Volume reconstruction finished" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.720662 4882 reconciler.go:26] "Reconciler: start to sync state" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.744368 4882 manager.go:324] Recovery completed Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.754772 4882 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.754947 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.758807 4882 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.758894 4882 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.758913 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.758944 4882 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.759038 4882 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.758948 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.759109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:22 crc kubenswrapper[4882]: W1002 16:17:22.760250 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.760360 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.761324 4882 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.761355 4882 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 16:17:22 
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.761390 4882 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.786946 4882 policy_none.go:49] "None policy: Start"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.788090 4882 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.788127 4882 state_mem.go:35] "Initializing new in-memory state store"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.792028 4882 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.845647 4882 manager.go:334] "Starting Device Plugin manager"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.845945 4882 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.845983 4882 server.go:79] "Starting device plugin registration server"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.846561 4882 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.846582 4882 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.847111 4882 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.847374 4882 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.847390 4882 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.856606 4882 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.859768 4882 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.859868 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861163 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861331 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861596 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.861652 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862359 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862412 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862422 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862621 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862651 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862669 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862740 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.862769 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.863749 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.863779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.863789 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.863940 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864038 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864077 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864083 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864110 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864138 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864609 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864628 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864638 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864750 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.864979 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865047 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865058 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865161 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865611 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865634 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865644 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865758 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.865782 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866345 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866384 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866400 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.866409 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.897357 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922241 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922288 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922314 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922333 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922349 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922366 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922388 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922446 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922473 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922573 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922619 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922646 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922732 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922768 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.922788 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.946878 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.948581 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.948633 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.948644 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:22 crc kubenswrapper[4882]: I1002 16:17:22.948675 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 16:17:22 crc kubenswrapper[4882]: E1002 16:17:22.949361 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024468 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024542 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024576 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024606 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024631 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024654 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024680 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024700 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024728 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024727 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024783 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024848 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024870 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024750 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024867 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024934 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024819 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024951 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024898 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024917 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024917 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.024990 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025073 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025101 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025122 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025157 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025165 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025203 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025269 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025405 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025165 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025203 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025269 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.025405 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.149988 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.151718 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.151767 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.151778 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.151812 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: E1002 16:17:23.152400 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.192085 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.213376 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.236409 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.246143 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.255533 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:23 crc kubenswrapper[4882]: W1002 16:17:23.258932 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-511883e446d1e25daa94cee7d6b76f0ed2248cd3cd5863d89b79d0d903408b79 WatchSource:0}: Error finding container 511883e446d1e25daa94cee7d6b76f0ed2248cd3cd5863d89b79d0d903408b79: Status 404 returned error can't find the container with id 511883e446d1e25daa94cee7d6b76f0ed2248cd3cd5863d89b79d0d903408b79
Oct 02 16:17:23 crc kubenswrapper[4882]: W1002 16:17:23.261460 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-92ee44289a24605106e092891251ddda9df096f7f29c0348e68e22a301f81fc1 WatchSource:0}: Error finding container 92ee44289a24605106e092891251ddda9df096f7f29c0348e68e22a301f81fc1: Status 404 returned error can't find the container with id 92ee44289a24605106e092891251ddda9df096f7f29c0348e68e22a301f81fc1
Oct 02 16:17:23 crc kubenswrapper[4882]: W1002 16:17:23.276271 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-19c76293c2a36f351cfaa1fdf4e9a818e3d34531df4a48bcc1bb7e41ba9f35bc WatchSource:0}: Error finding container 19c76293c2a36f351cfaa1fdf4e9a818e3d34531df4a48bcc1bb7e41ba9f35bc: Status 404 returned error can't find the container with id 19c76293c2a36f351cfaa1fdf4e9a818e3d34531df4a48bcc1bb7e41ba9f35bc
Oct 02 16:17:23 crc kubenswrapper[4882]: W1002 16:17:23.281008 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0ea3a3a76fa964c5b5c9664819505287599bcd6dc870c31453a6ed3626871369 WatchSource:0}: Error finding container 0ea3a3a76fa964c5b5c9664819505287599bcd6dc870c31453a6ed3626871369: Status 404 returned error can't find the container with id 0ea3a3a76fa964c5b5c9664819505287599bcd6dc870c31453a6ed3626871369
Oct 02 16:17:23 crc kubenswrapper[4882]: E1002 16:17:23.298841 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.553244 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.554827 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.554887 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.554902 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.554945 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc"
register node" node="crc" Oct 02 16:17:23 crc kubenswrapper[4882]: E1002 16:17:23.555512 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.690858 4882 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.765400 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ea3a3a76fa964c5b5c9664819505287599bcd6dc870c31453a6ed3626871369"} Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.767003 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"19c76293c2a36f351cfaa1fdf4e9a818e3d34531df4a48bcc1bb7e41ba9f35bc"} Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.768089 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fd715e3ce7ac8c7b0b619b4fb695b3179d0b9b88d3c0e05fc0a3b2f037759cfb"} Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.769075 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"92ee44289a24605106e092891251ddda9df096f7f29c0348e68e22a301f81fc1"} Oct 02 16:17:23 crc kubenswrapper[4882]: I1002 16:17:23.770703 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"511883e446d1e25daa94cee7d6b76f0ed2248cd3cd5863d89b79d0d903408b79"} Oct 02 16:17:23 crc kubenswrapper[4882]: W1002 16:17:23.968009 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:23 crc kubenswrapper[4882]: E1002 16:17:23.968118 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Oct 02 16:17:24 crc kubenswrapper[4882]: W1002 16:17:24.090393 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:24 crc kubenswrapper[4882]: E1002 16:17:24.090479 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Oct 02 16:17:24 crc kubenswrapper[4882]: E1002 16:17:24.100080 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s" Oct 02 16:17:24 crc kubenswrapper[4882]: W1002 16:17:24.100083 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:24 crc kubenswrapper[4882]: E1002 16:17:24.100161 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Oct 02 16:17:24 crc kubenswrapper[4882]: W1002 16:17:24.113738 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:24 crc kubenswrapper[4882]: E1002 16:17:24.113787 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.356028 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.357304 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.357367 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.357382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.357422 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 16:17:24 crc kubenswrapper[4882]: E1002 16:17:24.358056 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.691033 4882 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.775570 4882 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
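The reflector warnings above all retry the same kind of initial list, e.g. GET /api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0, until the API server answers. A sketch of that list call with client-go; the kubeconfig path and one-second retry are assumptions for the sketch:

```go
// Re-issues the informer-style initial list until it succeeds,
// matching the query shape logged by reflector.go above.
package main

import (
	"context"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/klog/v2"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig") // hypothetical path
	if err != nil {
		klog.Fatal(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	for {
		// One named node, page size 500, resourceVersion=0 so the
		// API server may serve the list from its watch cache.
		nodes, err := client.CoreV1().Nodes().List(context.TODO(), metav1.ListOptions{
			FieldSelector:   "metadata.name=crc",
			Limit:           500,
			ResourceVersion: "0",
		})
		if err != nil {
			klog.ErrorS(err, "failed to list *v1.Node") // retried until the API server is up
			time.Sleep(time.Second)
			continue
		}
		klog.InfoS("Caches populated", "type", "*v1.Node", "items", len(nodes.Items))
		return
	}
}
```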
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.775685 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7c03d481801cc3d51360721acc47c09adcab6442081ea918b609dd10fc41cc6d"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.775954 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.777628 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.777751 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.777775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.779668 4882 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc" exitCode=0
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.779746 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.779837 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.780824 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.780868 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.780887 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.781785 4882 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134" exitCode=0
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.781823 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.781946 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.782766 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.782805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.782820 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.785487 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.785537 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.785551 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.785562 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.785655 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.786506 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.786544 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.786558 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.787686 4882 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931" exitCode=0
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.787737 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931"}
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.787810 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.788528 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.788570 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.788584 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.790385 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.791910 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.791963 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:24 crc kubenswrapper[4882]: I1002 16:17:24.791976 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.691078 4882 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:25 crc kubenswrapper[4882]: E1002 16:17:25.701894 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="3.2s"
Oct 02 16:17:25 crc kubenswrapper[4882]: W1002 16:17:25.790821 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:25 crc kubenswrapper[4882]: E1002 16:17:25.790895 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.792511 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.792553 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.792564 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.792667 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.793571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.793595 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.793604 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.796975 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.797043 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.797062 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.797071 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.797076 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.797220 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.798194 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.798249 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.798260 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.799381 4882 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="38d0daddf06b7b550f787ec452c76f0041bb5c46ce475ff222d8d617053333a2" exitCode=0
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.799450 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.799467 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"38d0daddf06b7b550f787ec452c76f0041bb5c46ce475ff222d8d617053333a2"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.801258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.801307 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.801322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.806537 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.806624 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7e6f309f493e6a6be7dfd71077f5a26a675984ac41209e28644ba55d6d5d6303"}
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.806579 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808022 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808076 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808769 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808861 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.808938 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.958886 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.960709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.960752 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.960764 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:25 crc kubenswrapper[4882]: I1002 16:17:25.960794 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 16:17:25 crc kubenswrapper[4882]: E1002 16:17:25.961400 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Oct 02 16:17:26 crc kubenswrapper[4882]: W1002 16:17:26.092122 4882 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Oct 02 16:17:26 crc kubenswrapper[4882]: E1002 16:17:26.092243 4882 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.435057 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812144 4882 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cd2c000b284b44d9104399934318a43cf25828b4b98c1ffac94d97ee551e07d2" exitCode=0
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812306 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812321 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cd2c000b284b44d9104399934318a43cf25828b4b98c1ffac94d97ee551e07d2"}
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812426 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812307 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812449 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.812417 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813373 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813420 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813783 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813792 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813909 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813947 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.813969 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.814300 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.814367 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:26 crc kubenswrapper[4882]: I1002 16:17:26.814381 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.819253 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2886d696f279bf26a2ad225a6494d7a430fa7d64cad4eac9033a011dbf9601fd"}
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.819335 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"23e38c592b42647f68ce0a9a703dd94bf5e280423f3902c75f3d6ee04ef7b103"}
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.819363 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2add3b7b2089860865e3837dfbbd6fe04ef65ba9ad926e324fed7e15711ca56"}
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.819386 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b3ce6c9b1d3bb637d405d1c509148276bf79960b327f2ac8908f3768c662cf92"}
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.819368 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.821009 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.821047 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:27 crc kubenswrapper[4882]: I1002 16:17:27.821057 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:28 crc kubenswrapper[4882]: I1002 16:17:28.829555 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dd5737a11e90bfd6c64367e344e1bd681965368e16a926f2e9c7357c7a9fea4d"}
Oct 02 16:17:28 crc kubenswrapper[4882]: I1002 16:17:28.832348 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:28 crc kubenswrapper[4882]: I1002 16:17:28.837417 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:28 crc kubenswrapper[4882]: I1002 16:17:28.837527 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:28 crc kubenswrapper[4882]: I1002 16:17:28.837559 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.162261 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.163761 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.163811 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.163825 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.163850 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.683794 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.835523 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.837427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.837482 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:29 crc kubenswrapper[4882]: I1002 16:17:29.837498 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.075653 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.076011 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.078070 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.078134 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.078159 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.186371 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.186708 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.188564 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.188639 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.188659 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.541324 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.547944 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.606028 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.606361 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.607806 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.607842 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.607855 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.689966 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.763736 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.837843 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.837910 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.838118 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839198 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839360 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.839427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.840486 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.840535 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:30 crc kubenswrapper[4882]: I1002 16:17:30.840561 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.564347 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.839512 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.839553 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.840760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.840788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.840802 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.841465 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.841510 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:31 crc kubenswrapper[4882]: I1002 16:17:31.841530 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:32 crc kubenswrapper[4882]: I1002 16:17:32.203604 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:32 crc kubenswrapper[4882]: I1002 16:17:32.842262 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:32 crc kubenswrapper[4882]: I1002 16:17:32.843799 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:32 crc kubenswrapper[4882]: I1002 16:17:32.843874 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:32 crc kubenswrapper[4882]: I1002 16:17:32.843890 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:32 crc kubenswrapper[4882]: E1002 16:17:32.856896 4882 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 16:17:33 crc kubenswrapper[4882]: I1002 16:17:33.764916 4882 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 16:17:33 crc kubenswrapper[4882]: I1002 16:17:33.765037 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:17:36 crc kubenswrapper[4882]: E1002 16:17:36.307729 4882 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186ab8d817862e6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 16:17:22.688716394 +0000 UTC m=+1.437945921,LastTimestamp:2025-10-02 16:17:22.688716394 +0000 UTC m=+1.437945921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 02 16:17:36 crc kubenswrapper[4882]: I1002 16:17:36.345502 4882 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
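The startup-probe failures above come in two flavors: timeouts ("context deadline exceeded (Client.Timeout exceeded while awaiting headers)") while the container is still initializing, and an HTTP 403 once kube-apiserver is serving but rejects the anonymous probe on /livez. A minimal stand-in for such an HTTPS probe; the endpoint, timeout, and failure threshold are illustrative, not kubelet's prober:

```go
// GET the endpoint with a timeout; treat transport errors and
// status codes >= 400 (like the anonymous 403 on /livez) as failures.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{
		Timeout: 1 * time.Second, // corresponds to the "Client.Timeout exceeded" failures
		Transport: &http.Transport{
			// Probes hit self-signed serving certs, so verification is skipped in this sketch.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. context deadline exceeded, connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("https://192.168.126.11:10357/healthz"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```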
Oct 02 16:17:36 crc kubenswrapper[4882]: I1002 16:17:36.345585 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 02 16:17:36 crc kubenswrapper[4882]: I1002 16:17:36.354194 4882 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 02 16:17:36 crc kubenswrapper[4882]: I1002 16:17:36.354311 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.712118 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.712343 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.713765 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.713876 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.713963 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.723690 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.858395 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.859761 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.859795 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:39 crc kubenswrapper[4882]: I1002 16:17:39.859809 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.192869 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.193039 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.194448 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.194499 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.194516 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.695015 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.695264 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.696427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.696481 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.696494 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.699124 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.863692 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.863787 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.865639 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.865680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:40 crc kubenswrapper[4882]: I1002 16:17:40.865689 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.340589 4882 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.343357 4882 trace.go:236] Trace[1932711417]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 16:17:31.207) (total time: 10136ms):
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[1932711417]: ---"Objects listed" error: 10136ms (16:17:41.343)
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[1932711417]: [10.136097117s] [10.136097117s] END
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.343393 4882 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.390519 4882 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.401353 4882 trace.go:236] Trace[1993446227]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 16:17:26.519) (total time: 14881ms):
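Across the section, the lease controller's "Failed to ensure lease exists, will retry" interval doubles on every failure: 800ms, 1.6s, 3.2s, and now 6.4s. A minimal sketch of that doubling backoff; the cap value and the stubbed lease call are assumptions for the sketch, not taken from the log:

```go
// Doubling retry backoff matching the observed interval sequence.
package main

import (
	"errors"
	"fmt"
	"time"
)

// Stand-in for the real lease GET/CREATE against the API server.
func ensureLease() error {
	return errors.New("connect: connection refused")
}

func main() {
	interval := 800 * time.Millisecond
	const maxInterval = 7 * time.Second // assumed upper bound for the sketch
	for i := 0; i < 4; i++ {
		if err := ensureLease(); err == nil {
			break
		}
		fmt.Printf("Failed to ensure lease exists, will retry; interval=%v\n", interval)
		time.Sleep(interval)
		// Double the interval on each failure, clamped to the cap.
		interval *= 2
		if interval > maxInterval {
			interval = maxInterval
		}
	}
}
```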
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[1993446227]: ---"Objects listed" error: 14881ms (16:17:41.401)
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[1993446227]: [14.881602917s] [14.881602917s] END
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.401399 4882 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.403591 4882 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.404147 4882 trace.go:236] Trace[490043405]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 16:17:26.503) (total time: 14900ms):
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[490043405]: ---"Objects listed" error: 14900ms (16:17:41.403)
Oct 02 16:17:41 crc kubenswrapper[4882]: Trace[490043405]: [14.900339644s] [14.900339644s] END
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.404192 4882 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.405116 4882 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.485463 4882 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55736->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.485559 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:55736->192.168.126.11:17697: read: connection reset by peer"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.486133 4882 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.486226 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.493269 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.497428 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.603460 4882 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.603566 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.689930 4882 apiserver.go:52] "Watching apiserver"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.759661 4882 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.760167 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.760670 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.760810 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.760899 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.760982 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.761068 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.761116 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.761282 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.761411 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.761537 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.763689 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.764297 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.764327 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.764497 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.764598 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.764647 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.765076 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.765496 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.765613 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792095 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792430 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792476 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792499 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792557 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792583 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792609 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792613 4882 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792658 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.792666 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792678 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792699 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.792723 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.792751 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:42.292726794 +0000 UTC m=+21.041956531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.792847 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.792895 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:42.292883278 +0000 UTC m=+21.042112805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.793522 4882 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.805773 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.805821 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.805840 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.805927 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:42.305902446 +0000 UTC m=+21.055131973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.807276 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287fa
af92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.807678 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.807733 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.807756 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:41 crc kubenswrapper[4882]: E1002 16:17:41.807884 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:42.307845424 +0000 UTC m=+21.057075131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.811390 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.812126 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.820946 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.831910 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893718 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893808 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893828 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893854 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893877 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893913 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893937 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 
16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893958 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.893980 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894003 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894022 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894048 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894065 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894085 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894127 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894148 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894167 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 16:17:41 crc 
kubenswrapper[4882]: I1002 16:17:41.894189 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894184 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894211 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894255 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894277 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894298 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894318 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894335 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894353 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894381 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894406 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894432 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894450 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894471 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894497 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894516 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894522 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894536 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894554 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894573 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894564 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894592 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894705 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894724 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894771 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894807 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894835 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894872 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894903 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.894964 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895037 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895083 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895103 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895134 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895198 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895242 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895266 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895430 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895494 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:41 crc kubenswrapper[4882]: I1002 16:17:41.895509 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.084849 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.084923 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.084949 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085105 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085260 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085420 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085670 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085796 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085860 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085819 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085956 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.085965 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086046 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086084 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086105 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086256 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086474 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086719 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086767 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086780 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086792 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086936 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.086955 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.086986 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 16:17:42.586961535 +0000 UTC m=+21.336191172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087026 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087055 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087068 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087096 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087122 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087154 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087182 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087212 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087255 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087278 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087302 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087326 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087355 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087350 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087407 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087568 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087597 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087618 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087639 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087662 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087686 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087710 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087734 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087761 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087783 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087837 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087869 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087897 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087923 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087927 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.087981 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088011 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088171 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088178 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088199 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088247 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088273 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088299 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088320 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088344 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088364 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088383 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088406 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088431 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088455 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088480 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088495 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088504 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088545 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088572 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088595 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088618 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088644 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088666 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088760 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088787 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088810 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088811 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088834 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088857 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088878 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088900 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088917 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088932 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088948 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088965 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088981 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.088997 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089011 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089029 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089048 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089064 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089111 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089129 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089147 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089163 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089179 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089195 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089244 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089263 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089278 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089293 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089310 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089325 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089342 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089358 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089374 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089398 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089414 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089430 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089449 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089464 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089479 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089495 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089513 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089528 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089544 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089562 4882 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089579 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089594 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089610 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089625 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089641 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089657 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089674 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089690 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089707 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089722 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089738 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089753 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089770 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089786 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089802 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089819 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089848 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089863 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089886 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089904 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089922 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089938 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089955 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089972 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089990 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090006 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090021 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090039 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090055 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090072 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090089 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090105 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090123 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090142 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090161 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090179 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090196 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090251 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090272 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 16:17:42 crc 
kubenswrapper[4882]: I1002 16:17:42.090289 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090326 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090344 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090363 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090379 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090410 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090428 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090447 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090464 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090483 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 
16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090501 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090543 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090580 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090599 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090644 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090661 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090677 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090713 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090725 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090737 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090747 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090757 4882 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090766 4882 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090775 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090785 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090795 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090805 4882 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090816 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090825 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090834 4882 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090846 4882 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090856 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090865 4882 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090877 4882 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090887 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090898 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090908 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090917 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090926 4882 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090934 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090944 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090954 4882 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090963 4882 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090972 4882 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090980 4882 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090989 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090998 4882 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.091008 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.100813 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.112187 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089045 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.089707 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090066 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090287 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.090791 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.091047 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.091509 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.091994 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.101257 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.101570 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.101706 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.101848 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102074 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102386 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102421 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102635 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102697 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102844 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.102951 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103114 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103160 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103512 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103622 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103761 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.103763 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104010 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.113086 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104383 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104477 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104400 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104658 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104892 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104730 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.104971 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105028 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105117 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105173 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105309 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105727 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.105928 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.083658 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106303 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106348 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106322 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106709 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106859 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.106837 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.107076 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.107631 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.107696 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.108059 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.108086 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.108313 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.108312 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.108594 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109092 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109103 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109159 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109407 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109749 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109972 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.109953 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.110026 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.110888 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.110900 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.110915 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.111160 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.111584 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.111440 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.111922 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.111897 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.112320 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.112338 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.113631 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.113642 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.113935 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.114141 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.114359 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.114557 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.114562 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.114928 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.115165 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.115202 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.112338 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.115711 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.116241 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.116158 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.116321 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.116664 4882 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.116677 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.117233 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.117702 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.117843 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.117819 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118186 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118306 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118368 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118393 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118647 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118761 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118761 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118949 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118959 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.118993 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119107 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119133 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119266 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119268 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119431 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.119573 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.120083 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.120184 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.120438 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.120643 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.124057 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.124674 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.124831 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.125281 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.131004 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.131439 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.138248 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192538 4882 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192595 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192610 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192624 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192636 4882 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192648 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192660 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192676 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192689 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192702 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192713 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192726 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192738 4882 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192749 4882 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192761 4882 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192772 4882 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192785 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192796 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192809 4882 reconciler_common.go:293] 
"Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192822 4882 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192834 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192845 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192856 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192867 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192877 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192889 4882 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192902 4882 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192914 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192926 4882 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192939 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192950 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192960 4882 reconciler_common.go:293] 
"Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192971 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192982 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.192994 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193006 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193017 4882 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193028 4882 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193037 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193048 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193058 4882 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193069 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193080 4882 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193089 4882 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193337 4882 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193355 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193368 4882 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193379 4882 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193392 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193405 4882 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193418 4882 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193430 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193443 4882 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193454 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193467 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193477 4882 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193488 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193499 4882 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193510 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193522 4882 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193533 4882 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193546 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193559 4882 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193571 4882 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193582 4882 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193591 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193602 4882 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193613 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193623 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193636 4882 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193647 4882 
reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193657 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193668 4882 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193679 4882 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193691 4882 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193744 4882 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193756 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193767 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193777 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193788 4882 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193801 4882 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193813 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193824 4882 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193834 4882 reconciler_common.go:293] "Volume 
detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193844 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193854 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193864 4882 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193879 4882 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193890 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193901 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193913 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193926 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193939 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193950 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193961 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193972 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193984 4882 reconciler_common.go:293] "Volume detached for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.193995 4882 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194007 4882 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194018 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194029 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194039 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194051 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194061 4882 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194071 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194082 4882 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194093 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194105 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194117 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194129 4882 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194140 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194152 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.194164 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.295323 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.295619 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.295517 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.295926 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:43.295906594 +0000 UTC m=+22.045136121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.295733 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.296120 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:43.296111519 +0000 UTC m=+22.045341046 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372581 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372690 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372714 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372840 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372840 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.372944 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373001 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373048 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373133 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373131 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373069 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373243 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373425 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.373030 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.374073 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.374759 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.375599 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.376373 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.376481 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.376535 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.376517 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.376669 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.377196 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.378154 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.378901 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.379039 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.379157 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.379626 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.380172 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.380213 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.380378 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.380917 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.380951 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.381280 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.381542 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.381643 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.382708 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.382904 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.382953 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.383114 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.383154 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.383226 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.383437 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.384530 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.384559 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.385291 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.385300 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.385738 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.386119 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.391803 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.392874 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.394837 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.395132 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396803 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396849 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396901 4882 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396916 4882 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396926 4882 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396952 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396960 4882 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396970 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396979 4882 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396987 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.396995 4882 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.396999 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:42 crc 
kubenswrapper[4882]: E1002 16:17:42.397018 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397031 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397054 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397065 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397074 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397078 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:43.397060793 +0000 UTC m=+22.146290320 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397003 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.397157 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:43.397130955 +0000 UTC m=+22.146360482 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397189 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397233 4882 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397245 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397255 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397264 4882 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397273 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397283 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397294 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397304 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397317 4882 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397326 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397335 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397344 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397353 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397361 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397370 4882 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397384 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397395 4882 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397406 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397415 4882 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397426 4882 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397438 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397448 4882 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397457 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397467 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" 
(UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397477 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397487 4882 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397496 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397506 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397516 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397579 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397588 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397599 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397608 4882 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397621 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397630 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397639 4882 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397648 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.397656 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: W1002 16:17:42.451406 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7d32b93c8630b5387afd94a90775335ff1864e108c241ad79d70200e3ce04986 WatchSource:0}: Error finding container 7d32b93c8630b5387afd94a90775335ff1864e108c241ad79d70200e3ce04986: Status 404 returned error can't find the container with id 7d32b93c8630b5387afd94a90775335ff1864e108c241ad79d70200e3ce04986 Oct 02 16:17:42 crc kubenswrapper[4882]: W1002 16:17:42.453059 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ed75b12800bf95bf1f7680d4afb11d950bb467a37e8efc854adf596e7d32e1e6 WatchSource:0}: Error finding container ed75b12800bf95bf1f7680d4afb11d950bb467a37e8efc854adf596e7d32e1e6: Status 404 returned error can't find the container with id ed75b12800bf95bf1f7680d4afb11d950bb467a37e8efc854adf596e7d32e1e6 Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.598890 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:42 crc kubenswrapper[4882]: E1002 16:17:42.599158 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:17:43.599116824 +0000 UTC m=+22.348346371 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.660770 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.699884 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.768805 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.769447 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.770786 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.771431 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.772451 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.772953 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.773680 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.778825 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.792170 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.794833 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.795517 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.796250 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.796754 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.797411 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.798053 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.798656 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.799171 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.801059 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.801715 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.802575 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.803129 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.803515 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.803871 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.804709 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.805270 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.809487 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.810594 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.811352 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.812657 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.814068 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.814736 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.815336 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 16:17:42 crc 
kubenswrapper[4882]: I1002 16:17:42.816170 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.816641 4882 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.816744 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.818618 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.818878 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.819398 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.819859 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.821616 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.823068 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.823617 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.825563 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.826392 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.827773 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.828388 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.829390 4882 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.829624 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.833713 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.834444 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.835166 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.835877 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.837072 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.837731 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.839791 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.840384 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.841015 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.842266 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.842796 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.846131 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.865346 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.872772 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.874467 4882 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9" exitCode=255 Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.874531 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9"} Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.876229 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ed75b12800bf95bf1f7680d4afb11d950bb467a37e8efc854adf596e7d32e1e6"} Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.879801 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7d32b93c8630b5387afd94a90775335ff1864e108c241ad79d70200e3ce04986"} Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.881579 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"75c7c4d74c7e17d7d2c28ed96cd288492042b3412acd35d2dd9e099991a39ab1"} Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.884351 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.884677 4882 scope.go:117] "RemoveContainer" containerID="ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.885264 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.896099 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.925621 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.943967 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.962111 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.976029 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:42 crc kubenswrapper[4882]: I1002 16:17:42.991310 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.307138 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.307232 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.307332 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.307437 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:45.307379619 +0000 UTC m=+24.056609146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.307537 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.307584 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:45.307563574 +0000 UTC m=+24.056793101 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.408646 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.408702 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.408838 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.408855 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.408867 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.408929 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:45.408908997 +0000 UTC m=+24.158138524 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.408985 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.409032 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.409050 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.409132 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:45.409105401 +0000 UTC m=+24.158335108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.609624 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.609873 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:17:45.609838071 +0000 UTC m=+24.359067748 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.759717 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.759746 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:43 crc kubenswrapper[4882]: I1002 16:17:43.759727 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.759872 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.759997 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:43 crc kubenswrapper[4882]: E1002 16:17:43.760123 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.069544 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5mdgg"] Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.069882 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5wxmw"] Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.070130 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.070812 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ppcpg"] Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.070981 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jxblv"] Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.071303 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.071791 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.071791 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.074362 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.074818 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.075162 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.076228 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.076836 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.076884 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.077173 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.080388 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.080425 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.080401 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.080480 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.080482 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.081096 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.081245 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.084479 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.084693 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.102344 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114139 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114183 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-bin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114202 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd74899-256b-4b2c-bcd7-51fb1d08991b-mcd-auth-proxy-config\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114245 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-system-cni-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114289 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-multus-certs\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114319 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cnibin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114343 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-etc-kubernetes\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114368 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-multus\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114389 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-kubelet\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114419 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-system-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114439 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-k8s-cni-cncf-io\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114464 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5bq5\" (UniqueName: \"kubernetes.io/projected/75c66302-c57d-41c8-a014-97f26deffd27-kube-api-access-m5bq5\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114489 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9lr\" (UniqueName: \"kubernetes.io/projected/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-kube-api-access-ld9lr\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114520 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114545 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7mfz\" (UniqueName: \"kubernetes.io/projected/5f246815-70b9-4dc7-972c-76ba716075ba-kube-api-access-x7mfz\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114565 4882 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-netns\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114586 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-conf-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114608 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-os-release\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114628 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-cnibin\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114647 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-daemon-config\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114668 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3bd74899-256b-4b2c-bcd7-51fb1d08991b-rootfs\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114687 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bd74899-256b-4b2c-bcd7-51fb1d08991b-proxy-tls\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114707 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f246815-70b9-4dc7-972c-76ba716075ba-hosts-file\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114724 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-os-release\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc 
kubenswrapper[4882]: I1002 16:17:44.114785 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114828 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cni-binary-copy\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114859 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-socket-dir-parent\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114881 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8q9l\" (UniqueName: \"kubernetes.io/projected/3bd74899-256b-4b2c-bcd7-51fb1d08991b-kube-api-access-b8q9l\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114901 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-binary-copy\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.114921 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-hostroot\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.115299 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.127173 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.139107 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.153491 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.171958 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.189934 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.201397 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.210926 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215377 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215414 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cni-binary-copy\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215434 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-socket-dir-parent\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215451 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8q9l\" (UniqueName: \"kubernetes.io/projected/3bd74899-256b-4b2c-bcd7-51fb1d08991b-kube-api-access-b8q9l\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215474 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-binary-copy\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215491 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-hostroot\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215507 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215526 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-bin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215544 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd74899-256b-4b2c-bcd7-51fb1d08991b-mcd-auth-proxy-config\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215561 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-system-cni-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215586 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-multus-certs\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215608 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cnibin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215602 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-hostroot\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215671 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-etc-kubernetes\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215624 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-etc-kubernetes\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215707 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-kubelet\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215733 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-system-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215749 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-k8s-cni-cncf-io\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215763 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-multus\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 
16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215764 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-socket-dir-parent\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215779 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5bq5\" (UniqueName: \"kubernetes.io/projected/75c66302-c57d-41c8-a014-97f26deffd27-kube-api-access-m5bq5\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215874 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9lr\" (UniqueName: \"kubernetes.io/projected/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-kube-api-access-ld9lr\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215922 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215953 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7mfz\" (UniqueName: \"kubernetes.io/projected/5f246815-70b9-4dc7-972c-76ba716075ba-kube-api-access-x7mfz\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215963 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215989 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-netns\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216017 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-conf-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216044 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-kubelet\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216302 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-binary-copy\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216302 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cni-binary-copy\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216048 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-os-release\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216350 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-cnibin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216380 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-cnibin\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.215994 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-bin\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216373 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-system-cni-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216488 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-cnibin\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216502 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-netns\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216516 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-multus-certs\") pod 
\"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216506 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-var-lib-cni-multus\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216453 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-host-run-k8s-cni-cncf-io\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216529 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-conf-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216731 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3bd74899-256b-4b2c-bcd7-51fb1d08991b-mcd-auth-proxy-config\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216771 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-daemon-config\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.217972 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c66302-c57d-41c8-a014-97f26deffd27-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218043 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3bd74899-256b-4b2c-bcd7-51fb1d08991b-rootfs\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218133 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-system-cni-dir\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218241 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " 
pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.216880 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3bd74899-256b-4b2c-bcd7-51fb1d08991b-rootfs\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218331 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bd74899-256b-4b2c-bcd7-51fb1d08991b-proxy-tls\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218373 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f246815-70b9-4dc7-972c-76ba716075ba-hosts-file\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218409 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-os-release\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218453 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-multus-daemon-config\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218531 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-os-release\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218625 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f246815-70b9-4dc7-972c-76ba716075ba-hosts-file\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.218651 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75c66302-c57d-41c8-a014-97f26deffd27-os-release\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.222569 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.226766 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bd74899-256b-4b2c-bcd7-51fb1d08991b-proxy-tls\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.232783 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.233959 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5bq5\" (UniqueName: \"kubernetes.io/projected/75c66302-c57d-41c8-a014-97f26deffd27-kube-api-access-m5bq5\") pod \"multus-additional-cni-plugins-5wxmw\" (UID: \"75c66302-c57d-41c8-a014-97f26deffd27\") " pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.237058 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8q9l\" (UniqueName: \"kubernetes.io/projected/3bd74899-256b-4b2c-bcd7-51fb1d08991b-kube-api-access-b8q9l\") pod \"machine-config-daemon-jxblv\" (UID: \"3bd74899-256b-4b2c-bcd7-51fb1d08991b\") " pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.237401 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7mfz\" (UniqueName: \"kubernetes.io/projected/5f246815-70b9-4dc7-972c-76ba716075ba-kube-api-access-x7mfz\") pod \"node-resolver-5mdgg\" (UID: \"5f246815-70b9-4dc7-972c-76ba716075ba\") " pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.247750 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9lr\" (UniqueName: \"kubernetes.io/projected/565a5a5f-e220-4ce6-86a7-f94f9dbe48c2-kube-api-access-ld9lr\") pod \"multus-ppcpg\" (UID: \"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\") " pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.265833 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.299938 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.319025 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.329462 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.344730 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.361617 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.376446 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.391904 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5mdgg" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.395087 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.402928 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.416754 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.417803 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ppcpg" Oct 02 16:17:44 crc kubenswrapper[4882]: W1002 16:17:44.436446 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod565a5a5f_e220_4ce6_86a7_f94f9dbe48c2.slice/crio-35178992544ea939b21165fc4261a2b3003d99464e82eb8fb5668df5d5ac5b1d WatchSource:0}: Error finding container 35178992544ea939b21165fc4261a2b3003d99464e82eb8fb5668df5d5ac5b1d: Status 404 returned error can't find the container with id 35178992544ea939b21165fc4261a2b3003d99464e82eb8fb5668df5d5ac5b1d Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.436904 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" Oct 02 16:17:44 crc kubenswrapper[4882]: W1002 16:17:44.464949 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c66302_c57d_41c8_a014_97f26deffd27.slice/crio-4012467aa22bf0a284e76a3d0a9ef49c62db8723dfffe83206dc8a687fb3370c WatchSource:0}: Error finding container 4012467aa22bf0a284e76a3d0a9ef49c62db8723dfffe83206dc8a687fb3370c: Status 404 returned error can't find the container with id 4012467aa22bf0a284e76a3d0a9ef49c62db8723dfffe83206dc8a687fb3370c Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.781837 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p6qjz"] Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.782766 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.785492 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.786101 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.786506 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.786668 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.786966 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.787196 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.787455 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.805575 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.820453 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825130 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825160 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825177 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zv5\" (UniqueName: \"kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825205 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825240 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825268 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825292 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825310 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825329 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825499 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825559 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825595 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825637 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825715 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825747 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825784 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825808 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.825843 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.826002 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.826063 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.833264 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.846537 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.862463 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.875932 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.887680 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerStarted","Data":"caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.887735 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerStarted","Data":"4012467aa22bf0a284e76a3d0a9ef49c62db8723dfffe83206dc8a687fb3370c"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.890127 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.890237 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.891542 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerStarted","Data":"e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.891612 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerStarted","Data":"35178992544ea939b21165fc4261a2b3003d99464e82eb8fb5668df5d5ac5b1d"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.893333 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.893386 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.893399 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"a299ea0da2a65060182644de4147260cabaa358593d0dd27ba34aa1a65694c2b"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.894572 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5mdgg" event={"ID":"5f246815-70b9-4dc7-972c-76ba716075ba","Type":"ContainerStarted","Data":"a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.894610 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5mdgg" event={"ID":"5f246815-70b9-4dc7-972c-76ba716075ba","Type":"ContainerStarted","Data":"f16c74fa16737d3989f9a2f81f1453fd5a6d606b516cb8f0b89c899d6a056eeb"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.894903 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.896486 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.898556 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.898897 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.900136 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef"} Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.909912 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927532 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927639 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927708 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927777 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927723 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927854 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927855 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927961 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927969 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.927993 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928018 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928047 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928142 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928201 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928229 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928292 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928307 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928353 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928387 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928451 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928481 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928561 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928603 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928605 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928646 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zv5\" (UniqueName: \"kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928682 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928716 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928815 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928855 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin\") pod 
\"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.928902 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929227 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929329 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929356 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929441 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929337 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929546 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.929898 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.930025 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" 
Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.933703 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.954938 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zv5\" (UniqueName: \"kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5\") pod \"ovnkube-node-p6qjz\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.964171 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:44 crc kubenswrapper[4882]: I1002 16:17:44.989050 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:44Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.009129 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.022929 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.037035 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
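The quoted payload in each of these entries is a strategic merge patch against the pod's `status`, triple-escaped because it is quoted once by the kubelet's structured logger and once more inside the journal line. The `$setElementOrder/conditions` directive carries only the merge keys (`type`) of the condition list, so the API server can keep the list in a stable order while merging just the changed entries. The sketch below is a trimmed, unescaped copy of one such patch (values shortened from the network-operator entry above) parsed with nothing beyond the standard `encoding/json` package, to show how the directive reads once the escaping is removed.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed from one status patch in the log, with the journal's backslash
// escaping removed. $setElementOrder/conditions lists merge keys only.
const patch = `{
  "metadata": {"uid": "37a5e44f-9a88-4405-be8a-b645485e7312"},
  "status": {
    "$setElementOrder/conditions": [
      {"type": "PodReadyToStartContainers"},
      {"type": "Initialized"},
      {"type": "Ready"},
      {"type": "ContainersReady"},
      {"type": "PodScheduled"}
    ],
    "conditions": [
      {"lastTransitionTime": "2025-10-02T16:17:41Z", "status": "False", "type": "PodReadyToStartContainers"}
    ]
  }
}`

func main() {
	var doc map[string]any
	if err := json.Unmarshal([]byte(patch), &doc); err != nil {
		panic(err)
	}
	status := doc["status"].(map[string]any)
	// Print the ordering directive: the merge keys of the conditions list.
	for _, c := range status["$setElementOrder/conditions"].([]any) {
		fmt.Println(c.(map[string]any)["type"])
	}
}
```

The patches themselves are well formed; they are rejected only because the admission webhook guarding pod status updates cannot be called.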
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.054330 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.070697 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.083254 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.093680 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.097575 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.113334 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: W1002 16:17:45.115472 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7911af1a_fc82_463b_b72d_9c55e5073e45.slice/crio-fdec1172613743de8c303b4c53b53335599821745f00a282075465cd5c22b253 WatchSource:0}: Error finding container fdec1172613743de8c303b4c53b53335599821745f00a282075465cd5c22b253: Status 404 returned error can't find the container with id fdec1172613743de8c303b4c53b53335599821745f00a282075465cd5c22b253 Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.140562 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.156792 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.168754 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.195049 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.208135 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.220925 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.237518 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.251292 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.334297 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.334379 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.334493 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.334542 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.334570 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:49.334552542 +0000 UTC m=+28.083782069 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.334723 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:49.334698136 +0000 UTC m=+28.083927863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.435693 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.435762 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.435904 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.435956 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.435970 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.435904 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.436036 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:49.436017108 +0000 UTC m=+28.185246635 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.436046 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.436062 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.436105 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:49.43608921 +0000 UTC m=+28.185318957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.637029 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.637322 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:17:49.63728193 +0000 UTC m=+28.386511497 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.759763 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.760187 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.760341 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.760399 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.760533 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:45 crc kubenswrapper[4882]: E1002 16:17:45.760634 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.905305 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e" exitCode=0 Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.905377 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e"} Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.908036 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" exitCode=0 Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.908099 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.908143 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"fdec1172613743de8c303b4c53b53335599821745f00a282075465cd5c22b253"} Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.913630 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae"} Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.922908 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.938119 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.953358 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.977875 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:45 crc kubenswrapper[4882]: I1002 16:17:45.993164 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:45Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.018400 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.043800 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc 
kubenswrapper[4882]: I1002 16:17:46.075281 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nb
db\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.106413 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.120784 4882 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.134489 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.146103 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.158915 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.171902 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.187784 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.203278 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.217146 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.230559 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.244747 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.259857 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.280906 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.299118 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.314071 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pm5z5"] Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.315076 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.318250 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.318607 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.318927 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.319066 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.326091 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.339516 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.345533 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-serviceca\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.345583 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wftw\" (UniqueName: \"kubernetes.io/projected/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-kube-api-access-4wftw\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.345674 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-host\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.354183 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.365916 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.377338 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.392962 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc 
kubenswrapper[4882]: I1002 16:17:46.419020 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.431178 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.443111 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.446310 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-host\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.446391 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-serviceca\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.446416 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wftw\" (UniqueName: 
\"kubernetes.io/projected/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-kube-api-access-4wftw\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.446794 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-host\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.448834 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-serviceca\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.456989 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube
-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.473888 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wftw\" (UniqueName: \"kubernetes.io/projected/7cb59ee5-09c2-4d31-b1aa-1d2a57035275-kube-api-access-4wftw\") pod \"node-ca-pm5z5\" (UID: \"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\") " pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.476522 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.487541 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.498633 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.511934 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.529944 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.545395 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.558184 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.569627 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.649125 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pm5z5" Oct 02 16:17:46 crc kubenswrapper[4882]: W1002 16:17:46.664330 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb59ee5_09c2_4d31_b1aa_1d2a57035275.slice/crio-133ebb1d9e67cbc10fc20e7e64a2ffa18a561de628996f4e7d18c5f11fea9f22 WatchSource:0}: Error finding container 133ebb1d9e67cbc10fc20e7e64a2ffa18a561de628996f4e7d18c5f11fea9f22: Status 404 returned error can't find the container with id 133ebb1d9e67cbc10fc20e7e64a2ffa18a561de628996f4e7d18c5f11fea9f22 Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.919821 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerStarted","Data":"29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.924055 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.924135 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.924150 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.924167 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.925800 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pm5z5" event={"ID":"7cb59ee5-09c2-4d31-b1aa-1d2a57035275","Type":"ContainerStarted","Data":"133ebb1d9e67cbc10fc20e7e64a2ffa18a561de628996f4e7d18c5f11fea9f22"} Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.938013 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.953395 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.966891 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.983704 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:46 crc kubenswrapper[4882]: I1002 16:17:46.999043 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:46Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.011598 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.026366 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.058657 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.072608 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.086929 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.101014 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.112068 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.126393 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\
\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.147407 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.759609 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.759690 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.759722 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.759769 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.759845 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.759912 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.805640 4882 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.808929 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.808979 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.809003 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.809136 4882 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.815475 4882 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.815914 4882 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.817185 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.817250 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.817268 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.817293 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.817308 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.835079 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.839346 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.839382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.839396 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.839415 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.839429 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.853475 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.859094 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.859174 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.859196 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.859250 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.859281 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.874879 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.880292 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.880334 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.880348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.880371 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.880385 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.894595 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.899045 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.899086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.899095 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.899112 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.899124 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.913102 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: E1002 16:17:47.913243 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.915225 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.915263 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.915273 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.915290 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.915302 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:47Z","lastTransitionTime":"2025-10-02T16:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.932324 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba" exitCode=0 Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.932404 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba"} Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.938344 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.938399 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.939836 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pm5z5" event={"ID":"7cb59ee5-09c2-4d31-b1aa-1d2a57035275","Type":"ContainerStarted","Data":"c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1"} Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.947109 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.959623 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.973344 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:47 crc kubenswrapper[4882]: I1002 16:17:47.987621 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.002392 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:47Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.015987 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.017639 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.017699 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.017716 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.017741 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.017755 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.031436 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.047071 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.058112 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.073668 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.096910 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.113979 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.120449 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.120489 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.120499 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.120519 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.120530 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.129394 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.145369 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.160919 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.173185 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.196035 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rel
ease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.215961 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.228265 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.228316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.228328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.228348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.228363 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.237573 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 
16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.251271 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.264511 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.277501 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.290271 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.302128 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.316245 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.330084 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.331152 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.331196 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.331209 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.331247 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.331261 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.344326 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.358891 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.434694 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.434754 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.434769 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.434788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.434800 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.537314 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.537371 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.537385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.537407 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.537423 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.640043 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.640087 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.640097 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.640111 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.640121 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.742512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.742572 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.742588 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.742607 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.742620 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.845455 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.845489 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.845500 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.845517 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.845528 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.945088 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab" exitCode=0 Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.945171 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.947363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.947461 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.947476 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.947494 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.947507 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:48Z","lastTransitionTime":"2025-10-02T16:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.960463 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.972898 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:48 crc kubenswrapper[4882]: I1002 16:17:48.983317 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.000622 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:48Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.021619 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.049164 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.051121 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.051173 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.051184 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.051206 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.051236 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.073988 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.091442 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.111896 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.126484 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.139880 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.157760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.157815 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.157829 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.157849 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.157864 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.161514 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.172167 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.185376 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.260579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.260615 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.260624 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.260639 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.260650 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.363908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.363966 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.363977 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.364000 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.364012 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.377673 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.377741 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.377869 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.377942 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:57.377927039 +0000 UTC m=+36.127156566 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.377946 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.378058 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:57.378033172 +0000 UTC m=+36.127262859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.467069 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.467268 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.467412 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.467436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.467449 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.478752 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.478819 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.478961 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.478985 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.479007 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.479021 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.478989 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.479075 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:57.479059418 +0000 UTC m=+36.228288945 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.479078 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.479139 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:17:57.479120839 +0000 UTC m=+36.228350366 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.570360 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.570410 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.570421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.570445 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.570457 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.673975 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.674027 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.674040 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.674061 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.674078 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.681483 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.681713 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:17:57.681688252 +0000 UTC m=+36.430917799 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.759493 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.759556 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.759603 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.759681 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.759867 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:17:49 crc kubenswrapper[4882]: E1002 16:17:49.760020 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.778000 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.778056 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.778071 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.778095 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.778110 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.893445 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.893509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.893528 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.893557 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.893574 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.952587 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576" exitCode=0 Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.952682 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.958536 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.976168 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.995912 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.995951 4882 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.995965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.995983 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.995995 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:49Z","lastTransitionTime":"2025-10-02T16:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:49 crc kubenswrapper[4882]: I1002 16:17:49.998020 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.023343 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.038044 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.055827 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.083980 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath
\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:
17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z"
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.099328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.099375 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.099388 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.099408 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.099546 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.100172 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.113677 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.126707 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.145002 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.161863 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.176305 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.192455 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.201745 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.201797 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.201809 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.201830 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.201844 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.207441 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.305709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.306093 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.306161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.306283 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.306355 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.409804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.409847 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.409857 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.409872 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.409882 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.512513 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.512584 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.512596 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.512613 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.512626 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.615136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.615187 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.615203 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.615261 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.615277 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.718462 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.718519 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.718534 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.718554 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.718568 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.822026 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.822080 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.822091 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.822109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.822125 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.925146 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.925195 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.925223 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.925244 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.925257 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:50Z","lastTransitionTime":"2025-10-02T16:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.965755 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerStarted","Data":"1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe"} Oct 02 16:17:50 crc kubenswrapper[4882]: I1002 16:17:50.986203 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:50Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.003146 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.022847 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.028164 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.028237 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.028253 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.028272 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.028288 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.042051 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.060038 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.074031 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.094877 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.111986 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.129285 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.130891 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.130915 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.130925 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.130941 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.130955 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.147110 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.168010 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.183864 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.197857 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.211074 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.233031 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.233074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.233086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.233104 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.233118 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.335999 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.336049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.336060 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.336078 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.336091 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.438105 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.438148 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.438161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.438181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.438195 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.541158 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.541202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.541229 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.541252 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.541268 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.644056 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.644132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.644148 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.644173 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.644188 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.747103 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.747147 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.747156 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.747175 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.747184 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.759342 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.759347 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:51 crc kubenswrapper[4882]: E1002 16:17:51.759475 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:51 crc kubenswrapper[4882]: E1002 16:17:51.759592 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.759347 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:51 crc kubenswrapper[4882]: E1002 16:17:51.759882 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.850720 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.851020 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.851117 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.851238 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.851333 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.954143 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.954200 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.954231 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.954252 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.954266 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:51Z","lastTransitionTime":"2025-10-02T16:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.971955 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe" exitCode=0 Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.972036 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe"} Oct 02 16:17:51 crc kubenswrapper[4882]: I1002 16:17:51.988384 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.001641 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:51Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.017712 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.031459 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.044802 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.058872 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.058943 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.058957 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.058979 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.059017 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.060427 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.073907 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.090413 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.103631 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.118453 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.130497 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.151809 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.161938 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 
crc kubenswrapper[4882]: I1002 16:17:52.161976 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.161987 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.162005 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.162017 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.172060 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z 
is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.187598 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.265162 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.265239 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.265254 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.265278 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.265292 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.368498 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.368553 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.368563 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.368581 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.368594 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.472579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.472630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.472641 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.472659 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.472672 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.576775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.576858 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.576877 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.576908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.576928 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.680854 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.680915 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.680935 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.680959 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.680978 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.783702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.783766 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.783781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.783804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.783820 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.784941 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.806036 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.823025 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.835700 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.857644 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886378 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886645 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886715 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886730 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886753 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.886765 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.905741 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.926252 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.938903 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.955771 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.975413 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.988040 4882 generic.go:334] "Generic (PLEG): container finished" podID="75c66302-c57d-41c8-a014-97f26deffd27" containerID="71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525" exitCode=0 Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.988106 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerDied","Data":"71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.990346 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.990394 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.990414 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.990439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.990468 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:52Z","lastTransitionTime":"2025-10-02T16:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.993851 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.995163 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72"} Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.995613 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:52 crc kubenswrapper[4882]: I1002 16:17:52.995682 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.019780 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.033257 4882 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.034720 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.036989 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.056890 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.072803 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.090873 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.096045 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.096098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.096110 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.096129 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.096141 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.109765 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.123643 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.138107 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.154161 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.167899 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.180749 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.193136 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.200269 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.200318 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.200332 4882 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.200351 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.200361 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.204565 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.246363 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.268719 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.286577 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.303836 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.303879 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.303892 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.303908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.303918 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.407285 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.407736 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.407757 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.407782 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.407800 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.510998 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.511038 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.511049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.511066 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.511075 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.615063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.615575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.615764 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.615920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.616059 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.719748 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.719807 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.719821 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.719840 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.719851 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.760383 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:53 crc kubenswrapper[4882]: E1002 16:17:53.760807 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.760525 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:53 crc kubenswrapper[4882]: E1002 16:17:53.761777 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.760491 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:53 crc kubenswrapper[4882]: E1002 16:17:53.762091 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.822419 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.822688 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.822881 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.823024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.823198 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.927025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.927555 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.927781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.927961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:53 crc kubenswrapper[4882]: I1002 16:17:53.928115 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:53Z","lastTransitionTime":"2025-10-02T16:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.003351 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" event={"ID":"75c66302-c57d-41c8-a014-97f26deffd27","Type":"ContainerStarted","Data":"2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.003520 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.026548 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.033114 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.033171 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.033188 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.033230 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.033246 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.046419 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.072405 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.107494 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\
\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7
zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\
\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.126051 
4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.136184 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.136243 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.136254 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.136272 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.136287 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.140855 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.155396 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.173720 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed832
6292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.190799 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.202501 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.216285 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.235246 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.239330 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.239374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.239385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.239405 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.239419 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.249288 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.265391 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:54Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.341758 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.341814 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.341827 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.341845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.341869 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.445238 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.445299 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.445316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.445346 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.445360 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.548189 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.548305 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.548315 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.548331 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.548342 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.651286 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.651348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.651362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.651384 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.651692 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.756332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.756368 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.756377 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.756396 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.756409 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.859488 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.859550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.859565 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.859587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.859603 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.962425 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.962505 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.962517 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.962537 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:54 crc kubenswrapper[4882]: I1002 16:17:54.962550 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:54Z","lastTransitionTime":"2025-10-02T16:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.010821 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/0.log" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.015856 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72" exitCode=1 Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.015933 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.017584 4882 scope.go:117] "RemoveContainer" containerID="5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.036808 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.055369 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.065150 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.065243 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.065258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.065279 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.065294 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.072463 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.087481 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.105032 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.134852 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.149831 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
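[Editor's note] The ovnkube-controller termination message captured in the status patch above begins mid-word ("dminNetworkPolicy ..."), which is consistent with a byte-limited tail of the container log (the Kubernetes terminationMessagePolicy FallbackToLogsOnError behaviour). A sketch of that kind of tail read, reusing the log path from the "Finished parsing log file" entry earlier; the 4096-byte limit is an assumption for illustration:

    package main

    import (
    	"fmt"
    	"io"
    	"os"
    )

    // tail returns up to limit bytes from the end of the file, which is
    // roughly how a front-truncated termination message like the one
    // above comes about.
    func tail(path string, limit int64) ([]byte, error) {
    	f, err := os.Open(path)
    	if err != nil {
    		return nil, err
    	}
    	defer f.Close()
    	st, err := f.Stat()
    	if err != nil {
    		return nil, err
    	}
    	off := st.Size() - limit
    	if off < 0 {
    		off = 0
    	}
    	if _, err := f.Seek(off, io.SeekStart); err != nil {
    		return nil, err
    	}
    	return io.ReadAll(f)
    }

    func main() {
    	b, err := tail("/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/0.log", 4096)
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	fmt.Printf("%s", b)
    }

The "Stopping reflector" lines in that message are client-go informers shutting down, i.e. the tail of an orderly teardown after the controller hit its fatal error, not the error itself.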
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.163748 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.167866 4882 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.167920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.167933 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.167952 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.167969 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.183047 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.197947 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.210853 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.223885 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.239398 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.253582 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:55Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.270621 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.270667 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.270680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.270697 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.270706 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.377647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.377739 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.377760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.377790 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.377810 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.481779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.481851 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.481870 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.481896 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.481915 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.585183 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.585322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.585340 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.585368 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.585386 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.688763 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.688805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.688819 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.688840 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.688853 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.760280 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:55 crc kubenswrapper[4882]: E1002 16:17:55.760440 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.760302 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:55 crc kubenswrapper[4882]: E1002 16:17:55.760525 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.760280 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:55 crc kubenswrapper[4882]: E1002 16:17:55.760581 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.791742 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.791785 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.791796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.791811 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.791822 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.894587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.894642 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.894656 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.894682 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.894696 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.997370 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.997406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.997417 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.997435 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:55 crc kubenswrapper[4882]: I1002 16:17:55.997447 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:55Z","lastTransitionTime":"2025-10-02T16:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.020520 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/0.log" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.023573 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.023716 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.037607 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.049290 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.065402 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.079255 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.090091 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.100272 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.100317 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.100327 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.100346 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.100358 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.102101 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.114774 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.127111 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.139299 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.159036 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.171603 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.186683 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.202724 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.202779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.202790 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.202811 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.202823 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.206311 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d
7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.221016 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.305710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.305780 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.305802 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.305829 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.305849 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.408891 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.408967 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.408987 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.409015 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.409039 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.439483 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.455272 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126b
d791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.471148 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.487058 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.501104 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.512571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.512629 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.512643 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.512665 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.512677 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.519235 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.536185 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.552018 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.574392 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.592026 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.605578 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.616109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.616160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.616172 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.616193 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.616205 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.626125 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c86
68ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.646507 4882 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.667737 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.684109 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.718794 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.718838 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.718854 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.718874 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.718885 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.745687 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp"] Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.746460 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.749187 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.750880 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.771383 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.795024 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.812150 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.821634 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.821688 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.821699 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.821723 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.821737 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.829345 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.842325 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.858920 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.871577 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.882178 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.884691 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.884756 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz6bn\" (UniqueName: \"kubernetes.io/projected/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-kube-api-access-vz6bn\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.884785 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: 
\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.884854 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.895688 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.907947 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.924815 4882 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.925607 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.925681 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.925698 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.925717 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.925729 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:56Z","lastTransitionTime":"2025-10-02T16:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.939170 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.952090 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.964507 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.979606 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.986153 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz6bn\" (UniqueName: \"kubernetes.io/projected/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-kube-api-access-vz6bn\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.986235 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.986287 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.986345 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.987175 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.987246 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:56 crc kubenswrapper[4882]: I1002 16:17:56.993071 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.003048 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz6bn\" (UniqueName: \"kubernetes.io/projected/8e41edd9-d556-45f0-b911-a7d65ecc7ce0-kube-api-access-vz6bn\") pod 
\"ovnkube-control-plane-749d76644c-bxbkp\" (UID: \"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.028710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.028755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.028765 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.028964 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.028977 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.030756 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/1.log" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.031910 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/0.log" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.036066 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d" exitCode=1 Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.036118 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.036164 4882 scope.go:117] "RemoveContainer" containerID="5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.037326 4882 scope.go:117] "RemoveContainer" containerID="25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.037624 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.052741 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.061144 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.067479 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.081780 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: W1002 16:17:57.083292 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e41edd9_d556_45f0_b911_a7d65ecc7ce0.slice/crio-39ae4008939367b07313ba7de1343aa273141193a980825b53269c375c4f571a WatchSource:0}: Error finding container 39ae4008939367b07313ba7de1343aa273141193a980825b53269c375c4f571a: Status 404 returned error can't find the container with id 
39ae4008939367b07313ba7de1343aa273141193a980825b53269c375c4f571a Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.097788 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.115715 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.132413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.132473 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc 
kubenswrapper[4882]: I1002 16:17:57.132487 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.132508 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.132521 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.144174 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d
7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.159534 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.171875 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.183646 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.194129 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.212800 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.231406 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.236316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.236368 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.236379 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.236417 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.236429 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.245577 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.258576 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.273716 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.340061 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.340145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.340173 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.340206 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.340277 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.389980 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.390036 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.390126 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.390188 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:13.390172486 +0000 UTC m=+52.139402013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.390422 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.390646 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:13.390603016 +0000 UTC m=+52.139832583 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.443341 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.443397 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.443406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.443424 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.443439 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.493190 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.493267 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.493290 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.493380 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:13.493353394 +0000 UTC m=+52.242582961 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.493865 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.493991 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.494368 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.494404 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.494470 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.494586 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:13.494567293 +0000 UTC m=+52.243796850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.546919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.546974 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.546984 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.547003 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.547015 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.651122 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.651202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.651273 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.651300 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.651349 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.698108 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.698486 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 16:18:13.698408249 +0000 UTC m=+52.447637806 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.754535 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.754589 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.754602 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.754622 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.754637 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.760180 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.760278 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.760336 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.760456 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.760660 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.760787 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.858166 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.858269 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.858289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.858316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.858334 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.961830 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.961913 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.961932 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.961960 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.961978 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:57 crc kubenswrapper[4882]: E1002 16:17:57.982452 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:57Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.987193 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.987273 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.987287 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.987309 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:57 crc kubenswrapper[4882]: I1002 16:17:57.987358 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:57Z","lastTransitionTime":"2025-10-02T16:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.005645 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.010947 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.011001 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.011016 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.011037 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.011049 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.033020 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.038306 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.038357 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.038376 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.038402 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.038423 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.042129 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" event={"ID":"8e41edd9-d556-45f0-b911-a7d65ecc7ce0","Type":"ContainerStarted","Data":"1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.042188 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" event={"ID":"8e41edd9-d556-45f0-b911-a7d65ecc7ce0","Type":"ContainerStarted","Data":"3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.042227 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" event={"ID":"8e41edd9-d556-45f0-b911-a7d65ecc7ce0","Type":"ContainerStarted","Data":"39ae4008939367b07313ba7de1343aa273141193a980825b53269c375c4f571a"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.044017 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/1.log" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.047952 4882 scope.go:117] "RemoveContainer" containerID="25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.048145 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.056502 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.067048 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.069752 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.069809 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.069829 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.069855 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.069873 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.086548 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.089017 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.089179 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.093631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.093886 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.094253 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.094386 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.094523 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.103472 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea1
77225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.118516 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.139275 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.158051 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.174365 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.190242 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.197660 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.197840 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.197952 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.198126 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.198251 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.208263 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.230115 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.248796 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.264968 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6ldvk"] Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.267153 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.267592 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.268358 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.284375 4882 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.301637 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.301682 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.301697 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.301718 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.301733 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.309048 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10
-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 
16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.338890 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be1
50817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5db098d6beea9f90170532202c22c1c7ca82599a888201c8c19ccfe757f3ac72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"message\\\":\\\"dminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.836971 6193 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1002 16:17:54.837114 6193 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837178 6193 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837345 6193 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837423 6193 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837776 6193 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.837870 6193 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:17:54.838904 6193 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:17:54.839448 6193 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.355569 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.373788 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.390282 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.404334 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.404403 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.404417 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.404439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.404458 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.405169 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvplx\" (UniqueName: \"kubernetes.io/projected/9f988cab-7579-4a12-8df6-e3e91e42f7df-kube-api-access-tvplx\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.405298 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.411718 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\
\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.425985 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.446555 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.477920 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.501763 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.506822 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.506876 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tvplx\" (UniqueName: \"kubernetes.io/projected/9f988cab-7579-4a12-8df6-e3e91e42f7df-kube-api-access-tvplx\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.507121 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:17:58 crc kubenswrapper[4882]: E1002 16:17:58.507293 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:17:59.007247488 +0000 UTC m=+37.756477065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.509039 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.509142 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.509163 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.509247 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.509269 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.518151 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.533460 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.536878 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvplx\" (UniqueName: \"kubernetes.io/projected/9f988cab-7579-4a12-8df6-e3e91e42f7df-kube-api-access-tvplx\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.546779 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc 
kubenswrapper[4882]: I1002 16:17:58.565002 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.579646 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.596516 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.612362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.612444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.612457 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.612480 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.612494 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.613696 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.629236 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:58Z is after 2025-08-24T17:21:41Z" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.718144 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.718543 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.718609 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.718680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.718749 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.822480 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.822564 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.822586 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.822614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.822632 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.925460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.925542 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.925556 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.925578 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:58 crc kubenswrapper[4882]: I1002 16:17:58.925590 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:58Z","lastTransitionTime":"2025-10-02T16:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.011814 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.011999 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.012079 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:18:00.012056567 +0000 UTC m=+38.761286094 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.031107 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.031784 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.032033 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.032314 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.032582 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.137460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.137547 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.137570 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.137605 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.137626 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
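
Note the retry cadence on the metrics-certs mount above: the first failure deferred 500ms, this one 1s. The kubelet backs off exponentially on repeated failures of the same volume operation, and "object \"openshift-multus\"/\"metrics-daemon-secret\" not registered" means the kubelet's secret manager has not yet registered that object in its cache, typically because pod sync is blocked earlier in the chain. The doubling below is a sketch of that schedule only; the kubelet's exact initial duration, factor, and cap live in its nestedpendingoperations code and may differ by version, and the cap here is an assumption:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Sketch of exponential backoff: start at 500ms and double per failure,
    	// capped at a maximum. The 500ms -> 1s progression mirrors the log;
    	// the 2m cap is an assumption for illustration.
    	d := 500 * time.Millisecond
    	maxDelay := 2 * time.Minute
    	for attempt := 1; attempt <= 8; attempt++ {
    		fmt.Printf("attempt %d: durationBeforeRetry %s\n", attempt, d)
    		d *= 2
    		if d > maxDelay {
    			d = maxDelay
    		}
    	}
    }
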
Has your network provider started?"} Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.242061 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.242142 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.242165 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.242194 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.242249 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.345525 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.345595 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.345617 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.345647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.345665 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.449680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.450145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.450164 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.450193 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.450249 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.554049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.554140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.554164 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.554198 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.554259 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.657685 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.657746 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.657760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.657801 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.657819 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.759417 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.759430 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.759464 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.759414 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.759560 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.759781 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.759916 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:17:59 crc kubenswrapper[4882]: E1002 16:17:59.760049 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.760845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.760889 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.760908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.760932 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.760950 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
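Four pods are skipped above with the same "network is not ready" error. A sketch for enumerating the pods stuck behind this condition, assuming the third-party `kubernetes` Python client and a kubeconfig with access to this cluster:

```python
# Sketch only: list pods that are not Ready, mirroring the
# "Error syncing pod, skipping" entries above.
from kubernetes import client, config

config.load_kube_config()  # assumes a working kubeconfig for the crc cluster
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    for cond in pod.status.conditions or []:
        if cond.type == "Ready" and cond.status != "True":
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: "
                  f"{cond.reason or 'NotReady'}")
```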
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.866313 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.866374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.866385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.866404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.866417 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.969842 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.969880 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.969891 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.969914 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:17:59 crc kubenswrapper[4882]: I1002 16:17:59.969926 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:17:59Z","lastTransitionTime":"2025-10-02T16:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.025259 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:00 crc kubenswrapper[4882]: E1002 16:18:00.025421 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 16:18:00 crc kubenswrapper[4882]: E1002 16:18:00.025488 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:18:02.02547197 +0000 UTC m=+40.774701497 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.072729 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.072790 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.072808 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.072835 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.072855 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.176509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.176550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.176561 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.176580 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.176594 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.280318 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.280394 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.280413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.280441 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.280462 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.383750 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.383796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.383805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.383823 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.383841 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.488103 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.488180 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.488203 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.488281 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.488307 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.591687 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.591755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.591773 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.591802 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.591820 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.695456 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.695512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.695524 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.695550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.695564 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.798494 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.798544 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.798562 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.798591 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.798611 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.908836 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.908872 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.908881 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.908898 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:00 crc kubenswrapper[4882]: I1002 16:18:00.908908 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:00Z","lastTransitionTime":"2025-10-02T16:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.011959 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.012001 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.012015 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.012031 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.012043 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.114953 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.115003 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.115018 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.115039 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.115053 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.217246 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.217297 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.217311 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.217332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.217347 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.319631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.319686 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.319698 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.319716 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.319730 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.422865 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.423135 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.423196 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.423274 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.423333 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.525647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.525978 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.526068 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.526166 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.526307 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.629629 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.629677 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.629690 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.629710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.629725 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.733209 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.733320 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.733340 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.733370 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.733392 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.759934 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.760001 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.760092 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:01 crc kubenswrapper[4882]: E1002 16:18:01.760199 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:01 crc kubenswrapper[4882]: E1002 16:18:01.760343 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:01 crc kubenswrapper[4882]: E1002 16:18:01.760497 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.760611 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:01 crc kubenswrapper[4882]: E1002 16:18:01.760832 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.837363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.837899 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.838089 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.838311 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.838578 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.941918 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.941992 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.942016 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.942050 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:01 crc kubenswrapper[4882]: I1002 16:18:01.942076 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:01Z","lastTransitionTime":"2025-10-02T16:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.045130 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.045191 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.045237 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.045266 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.045284 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.051392 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:02 crc kubenswrapper[4882]: E1002 16:18:02.051702 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 16:18:02 crc kubenswrapper[4882]: E1002 16:18:02.051887 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:18:06.051863012 +0000 UTC m=+44.801092539 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.148035 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.148395 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.148512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.148630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.148732 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.251363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.251663 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.251770 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.251892 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.252103 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
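The two mount attempts above fail with durationBeforeRetry 2s (at 16:18:00) and then 4s (at 16:18:02): the volume reconciler doubles its delay after each failed attempt. An illustration of that doubling pattern; the cap is an assumption for the sketch, not a documented kubelet value:

```python
# Illustration of the doubling retry delay visible above (2s, then 4s).
def next_retry_delay(previous_s: float, cap_s: float = 32.0) -> float:
    """Exponential backoff: double the wait after each failed mount, up to a cap."""
    return min(previous_s * 2, cap_s)

delay = 2.0
for attempt in range(1, 5):
    print(f"attempt {attempt}: retry in {delay:.0f}s")
    delay = next_retry_delay(delay)
```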
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.356122 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.356730 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.356825 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.356939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.357056 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.461357 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.461796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.461905 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.462017 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.462116 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.565814 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.565862 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.565876 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.565897 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.565910 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.669401 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.669476 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.669489 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.669509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.669518 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.772562 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.772621 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.772631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.772650 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.772663 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.781890 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z"
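Status patches start failing here because the network-node-identity webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z. A sketch for reading that endpoint's validity window from the node itself; the endpoint is taken from the log, and the third-party `cryptography` package is assumed to be available:

```python
# Sketch: fetch the webhook's serving certificate and print its validity window.
import socket
import ssl

from cryptography import x509  # third-party; assumed installed

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # read the cert without verifying it

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("notBefore:", cert.not_valid_before)  # naive datetimes, UTC
print("notAfter: ", cert.not_valid_after)
```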
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.800907 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.818244 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.839680 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.853552 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.873104 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.874851 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.875247 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:02 crc 
kubenswrapper[4882]: I1002 16:18:02.875350 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.875630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.875959 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.893147 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d
7c3b325c4eb27a138a2c716d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.909049 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.922587 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.937509 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.951255 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.969037 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.980522 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.980573 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.980581 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.980598 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.980609 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:02Z","lastTransitionTime":"2025-10-02T16:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.983304 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:02 crc kubenswrapper[4882]: I1002 16:18:02.997821 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.013929 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:03Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.027662 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:03Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.083427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.083812 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.083934 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.084069 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.084165 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.188281 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.188591 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.188677 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.188766 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.188863 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.292596 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.292963 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.293074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.293163 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.293262 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.396413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.396457 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.396467 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.396488 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.396502 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.499822 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.500140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.500204 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.500347 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.500431 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.603695 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.603735 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.603744 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.603766 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.603778 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.706974 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.707017 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.707026 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.707043 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.707053 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.759321 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.759359 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.759353 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.760043 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:03 crc kubenswrapper[4882]: E1002 16:18:03.760241 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:03 crc kubenswrapper[4882]: E1002 16:18:03.760365 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:03 crc kubenswrapper[4882]: E1002 16:18:03.760442 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:03 crc kubenswrapper[4882]: E1002 16:18:03.760494 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.809943 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.810004 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.810017 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.810039 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.810056 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.913265 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.913315 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.913325 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.913343 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:03 crc kubenswrapper[4882]: I1002 16:18:03.913354 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:03Z","lastTransitionTime":"2025-10-02T16:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.016891 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.017363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.017627 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.017832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.017987 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.120995 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.121039 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.121049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.121066 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.121076 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.224769 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.224834 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.224853 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.224913 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.224933 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.327840 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.327921 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.327941 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.327970 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.327992 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.430497 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.430543 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.430556 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.430575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.430591 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.533461 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.533509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.533526 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.533548 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.533564 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.635799 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.635832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.635841 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.635857 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.635866 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.738999 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.739057 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.739073 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.739099 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.739131 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.842919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.843539 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.843846 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.844035 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.844198 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.947143 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.947190 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.947202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.947236 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:04 crc kubenswrapper[4882]: I1002 16:18:04.947252 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:04Z","lastTransitionTime":"2025-10-02T16:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.050252 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.050306 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.050321 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.050340 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.050355 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.152975 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.153015 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.153025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.153083 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.153093 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.256829 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.257464 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.257668 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.257802 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.257954 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.361628 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.361677 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.361694 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.361717 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.361733 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.465719 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.465782 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.465796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.465818 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.465832 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.569393 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.569485 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.569507 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.569533 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.569552 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.672542 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.672856 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.672990 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.673064 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.673126 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.759900 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.759911 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.759974 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.760024 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:05 crc kubenswrapper[4882]: E1002 16:18:05.760690 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:05 crc kubenswrapper[4882]: E1002 16:18:05.760858 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:05 crc kubenswrapper[4882]: E1002 16:18:05.760962 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:05 crc kubenswrapper[4882]: E1002 16:18:05.761138 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.776057 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.776361 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.776433 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.776507 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.776615 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.879946 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.880281 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.880376 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.880468 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.880587 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.983836 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.983896 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.983912 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.983934 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:05 crc kubenswrapper[4882]: I1002 16:18:05.983954 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:05Z","lastTransitionTime":"2025-10-02T16:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.086828 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.086893 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.086908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.086927 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.086941 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:06Z","lastTransitionTime":"2025-10-02T16:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.093726 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:06 crc kubenswrapper[4882]: E1002 16:18:06.093932 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:06 crc kubenswrapper[4882]: E1002 16:18:06.094031 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:18:14.09400647 +0000 UTC m=+52.843236037 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.197530 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.197981 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.198149 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.198336 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.198501 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:06Z","lastTransitionTime":"2025-10-02T16:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.301557 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.301632 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.301653 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.301680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.301699 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:06Z","lastTransitionTime":"2025-10-02T16:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
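Note: the "(durationBeforeRetry 8s)" above is kubelet's per-volume exponential backoff: each consecutive MountVolume failure roughly doubles the wait before the next attempt. Assuming the common defaults of a 500ms initial delay, a factor of 2, and a cap around two minutes (an assumption, not confirmed by this log), 8s would correspond to the fifth consecutive failure. A Go sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed backoff parameters (not confirmed by this log): 500ms initial
	// delay, doubling per failure, capped at roughly two minutes.
	const (
		initial = 500 * time.Millisecond
		maxWait = 2*time.Minute + 2*time.Second
	)
	// Timestamp of the failed attempt, taken from the log entry above.
	lastError := time.Date(2025, 10, 2, 16, 18, 6, 0, time.UTC)
	wait := initial
	for failure := 1; failure <= 6; failure++ {
		fmt.Printf("failure %d: no retries permitted until %s (durationBeforeRetry %s)\n",
			failure, lastError.Add(wait).Format(time.RFC3339), wait)
		wait *= 2
		if wait > maxWait {
			wait = maxWait
		}
	}
	// Under these assumptions, failure 5 prints durationBeforeRetry 8s,
	// matching the nestedpendingoperations entry above.
}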
Has your network provider started?"}
Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.405296 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.405640 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.405703 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.405767 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:06 crc kubenswrapper[4882]: I1002 16:18:06.405823 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:06Z","lastTransitionTime":"2025-10-02T16:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.473575 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz"
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.475033 4882 scope.go:117] "RemoveContainer" containerID="25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d"
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.759878 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:07 crc kubenswrapper[4882]: E1002 16:18:07.759993 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.760422 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:07 crc kubenswrapper[4882]: E1002 16:18:07.760588 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.760657 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:07 crc kubenswrapper[4882]: E1002 16:18:07.760701 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:07 crc kubenswrapper[4882]: I1002 16:18:07.760757 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:07 crc kubenswrapper[4882]: E1002 16:18:07.760825 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.109115 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/1.log"
Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.111206 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c"}
Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.112419 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz"
Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.137630 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.150503 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.166313 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.166363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.166373 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.166388 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.166398 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.170347 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c86
68ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.232556 4882 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.255314 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.269264 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.269324 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.269334 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.269353 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.269362 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.275877 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.291971 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.307541 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.320796 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.333065 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.346361 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.364303 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.371848 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.372037 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.372157 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.372252 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.372330 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.378348 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.396098 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.413103 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.417524 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.417582 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.417596 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.417614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.417627 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.431448 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.433567 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.437725 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.437762 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.437772 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.437788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.437797 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.451008 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.455474 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.455530 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.455548 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.455566 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.455575 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.470433 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.475377 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.475430 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.475444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.475464 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.475477 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.490360 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.495240 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.495291 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.495303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.495324 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.495335 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.511502 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:08Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:08 crc kubenswrapper[4882]: E1002 16:18:08.511693 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.513848 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.513890 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.513916 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.513939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.513959 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.616653 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.617485 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.617572 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.617654 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.617747 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.720564 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.720616 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.720626 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.720644 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.720654 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.822811 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.822865 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.822886 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.822906 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.822921 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.925990 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.926046 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.926058 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.926076 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:08 crc kubenswrapper[4882]: I1002 16:18:08.926090 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:08Z","lastTransitionTime":"2025-10-02T16:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.029577 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.029626 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.029636 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.029652 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.029662 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.116873 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/2.log" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.117509 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/1.log" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.120249 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c" exitCode=1 Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.120310 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.120370 4882 scope.go:117] "RemoveContainer" containerID="25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.121059 4882 scope.go:117] "RemoveContainer" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c" Oct 02 16:18:09 crc kubenswrapper[4882]: E1002 16:18:09.121310 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.132118 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.132168 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.132182 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.132199 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.132230 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.133563 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.149729 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.170860 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.186511 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.200836 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.214083 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.225514 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.235882 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.235920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.235929 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.235946 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.235957 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.237622 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.246909 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.259378 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.272174 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.289179 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.308179 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.322978 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.337082 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.339091 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.339125 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.339137 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.339158 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.339171 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.351878 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:09Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.442365 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.442410 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.442421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.442439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.442450 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.544591 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.544635 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.544682 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.544700 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.544711 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.647124 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.648136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.648160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.648182 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.648195 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.751734 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.751793 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.751805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.751826 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.751839 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.760017 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.760017 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:09 crc kubenswrapper[4882]: E1002 16:18:09.760540 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:09 crc kubenswrapper[4882]: E1002 16:18:09.760717 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.760030 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:09 crc kubenswrapper[4882]: E1002 16:18:09.760849 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.760188 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:09 crc kubenswrapper[4882]: E1002 16:18:09.760943 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.856749 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.857091 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.857102 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.857118 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.857128 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.960919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.960973 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.960984 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.961008 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:09 crc kubenswrapper[4882]: I1002 16:18:09.961019 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:09Z","lastTransitionTime":"2025-10-02T16:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.063712 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.063795 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.063812 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.063837 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.063854 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.081113 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.090718 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.099047 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.120970 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.127950 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/2.log" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.133403 4882 scope.go:117] "RemoveContainer" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c" Oct 02 16:18:10 crc kubenswrapper[4882]: E1002 16:18:10.133631 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.142018 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.156186 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.167191 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.167318 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.167332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.167351 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.167363 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.171505 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.183400 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.201552 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.226882 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25867b6466fc3458c4a22cffd37eadff9b74972d7c3b325c4eb27a138a2c716d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"message\\\":\\\"3] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nF1002 16:17:56.154473 6342 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:17:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 16:17:56.154496 6342 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1002 16:17:56.154502 6342 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1002 16:17:56.154513 6342 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-5wxmw\\\\nI1002 16:17:56.154516 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.238458 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.250899 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.265206 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.270324 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.270360 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.270370 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.270390 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.270404 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.281715 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.300362 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.316257 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb
83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.331585 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.343767 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.361466 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.373375 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.373421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.373439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.373457 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.373470 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.377487 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.392141 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.407040 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.421093 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.435355 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.449523 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.464611 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.476732 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.476783 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.476792 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.476811 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.476826 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.482289 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.498766 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.512709 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.533362 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.555190 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.572119 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.579127 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.579183 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.579196 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.579235 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.579256 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.597608 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.612242 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.632001 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:10Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.683175 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.683267 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.683279 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.683323 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.683338 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.786444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.786492 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.786508 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.786528 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.786542 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.897709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.897766 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.897779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.897799 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:10 crc kubenswrapper[4882]: I1002 16:18:10.897809 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:10Z","lastTransitionTime":"2025-10-02T16:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.000457 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.000507 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.000527 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.000556 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.000573 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.103250 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.103299 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.103312 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.103344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.103357 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.206480 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.206537 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.206551 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.206569 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.206581 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.310018 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.310089 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.310110 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.310137 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.310154 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.413081 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.413174 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.413197 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.413294 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.413350 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.517140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.517249 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.517269 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.517295 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.517317 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.621033 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.621127 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.621152 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.621190 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.621250 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.724508 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.724594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.724614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.724662 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.724703 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.759434 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.759480 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.759568 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.759605 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:11 crc kubenswrapper[4882]: E1002 16:18:11.759673 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:11 crc kubenswrapper[4882]: E1002 16:18:11.759787 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:11 crc kubenswrapper[4882]: E1002 16:18:11.760069 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:11 crc kubenswrapper[4882]: E1002 16:18:11.760279 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.828034 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.828090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.828138 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.828160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.828178 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.931533 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.931599 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.931610 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.931630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:11 crc kubenswrapper[4882]: I1002 16:18:11.931644 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:11Z","lastTransitionTime":"2025-10-02T16:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.035195 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.035268 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.035278 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.035297 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.035311 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.138789 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.138834 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.138848 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.138866 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.138878 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.243066 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.243150 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.243169 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.243197 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.243240 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.346280 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.346330 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.346344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.346363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.346382 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.449348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.449515 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.449535 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.449565 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.449586 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.553329 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.553401 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.553420 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.553446 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.553465 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.656965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.657052 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.657079 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.657113 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.657142 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.760287 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.760546 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.760563 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.760590 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.760610 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.781707 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.796866 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.813248 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.829275 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.843768 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.857316 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.862616 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.862690 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.862703 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.862723 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.862738 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.883482 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.900659 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.915908 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.932392 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.948006 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.961107 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.968555 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.968628 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.968645 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.968672 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.968689 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:12Z","lastTransitionTime":"2025-10-02T16:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:12 crc kubenswrapper[4882]: I1002 16:18:12.980020 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:
17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:12Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.017803 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa
6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:13Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.041570 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:13Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.058076 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:13Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071169 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071206 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071239 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071259 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071269 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.071777 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:13Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.173781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.173859 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.173870 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.173888 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.173899 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.277385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.278349 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.278436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.278473 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.278496 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.381780 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.381847 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.381866 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.381895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.381915 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.481603 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.481700 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.481906 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.481905 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.482001 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:45.48197546 +0000 UTC m=+84.231205027 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.482030 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:45.482016621 +0000 UTC m=+84.231246178 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.484559 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.484614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.484629 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.484675 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.484691 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.583198 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.583353 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583425 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583465 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583479 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583555 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:45.583534328 +0000 UTC m=+84.332763855 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583634 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583667 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583688 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.583778 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:18:45.583751133 +0000 UTC m=+84.332980690 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.589664 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.589756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.589782 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.589815 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.589833 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.692450 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.692496 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.692508 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.692527 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.692541 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.760048 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.760235 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.760445 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.760503 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.760502 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.760569 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.760721 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.760859 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.786189 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:18:13 crc kubenswrapper[4882]: E1002 16:18:13.786389 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:18:45.786362698 +0000 UTC m=+84.535592245 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.794688 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.794725 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.794737 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.794758 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.794771 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.897075 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.897114 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.897124 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.897140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:13 crc kubenswrapper[4882]: I1002 16:18:13.897150 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:13Z","lastTransitionTime":"2025-10-02T16:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.000280 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.000355 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.000374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.000404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.000426 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.103899 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.103975 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.103989 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.104043 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.104058 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.190061 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:14 crc kubenswrapper[4882]: E1002 16:18:14.190259 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:14 crc kubenswrapper[4882]: E1002 16:18:14.190327 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:18:30.190304296 +0000 UTC m=+68.939533833 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.207395 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.207479 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.207507 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.207539 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.207563 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.310587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.310643 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.310653 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.310667 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.310677 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.412832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.412871 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.412880 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.412895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.412905 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.515862 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.515918 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.515931 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.515951 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.515964 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.619552 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.619594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.619603 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.619621 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.619633 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.722604 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.722654 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.722662 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.722684 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.722696 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.826900 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.826975 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.827001 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.827034 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.827059 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.930681 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.930750 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.930772 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.930809 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:14 crc kubenswrapper[4882]: I1002 16:18:14.930838 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:14Z","lastTransitionTime":"2025-10-02T16:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.033630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.033722 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.033748 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.033781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.033806 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.136881 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.136956 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.136977 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.137007 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.137028 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.240756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.240845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.240868 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.240897 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.240916 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.344493 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.344536 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.344547 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.344566 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.344577 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.447051 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.447097 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.447108 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.447125 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.447138 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.550265 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.550321 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.550333 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.550357 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.550371 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.653443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.653513 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.653528 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.653549 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.653564 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.756063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.756121 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.756132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.756157 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.756168 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.759342 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.759395 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.759405 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:15 crc kubenswrapper[4882]: E1002 16:18:15.759461 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.759358 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:15 crc kubenswrapper[4882]: E1002 16:18:15.759552 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:15 crc kubenswrapper[4882]: E1002 16:18:15.759612 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:15 crc kubenswrapper[4882]: E1002 16:18:15.759835 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.859904 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.860009 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.860029 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.860056 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.860075 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.962927 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.962970 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.962984 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.963009 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:15 crc kubenswrapper[4882]: I1002 16:18:15.963023 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:15Z","lastTransitionTime":"2025-10-02T16:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.066334 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.066392 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.066404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.066425 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.066437 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.169936 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.170010 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.170026 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.170055 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.170072 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.273838 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.273902 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.273914 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.273939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.273953 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.376937 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.377002 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.377025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.377049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.377063 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.480567 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.480635 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.480647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.480664 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.480675 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.583327 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.583397 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.583416 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.583443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.583458 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.686119 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.686159 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.686170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.686187 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.686197 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.789294 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.789336 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.789344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.789359 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.789368 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.892259 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.892328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.892345 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.892371 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.892388 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.994775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.994860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.994880 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.994905 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:16 crc kubenswrapper[4882]: I1002 16:18:16.994924 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:16Z","lastTransitionTime":"2025-10-02T16:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.098683 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.098756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.098777 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.098805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.098823 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.203089 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.203156 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.203184 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.203245 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.203279 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.306457 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.306512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.306523 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.306543 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.306554 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.409932 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.409996 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.410017 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.410042 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.410061 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.513554 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.513625 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.513643 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.513702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.513724 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.617170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.617328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.617356 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.617385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.617402 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.720873 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.720924 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.720934 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.720950 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.720959 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.760089 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.760141 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.760111 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.760233 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:17 crc kubenswrapper[4882]: E1002 16:18:17.760357 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:17 crc kubenswrapper[4882]: E1002 16:18:17.760467 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:17 crc kubenswrapper[4882]: E1002 16:18:17.760560 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:17 crc kubenswrapper[4882]: E1002 16:18:17.760681 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.823972 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.824024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.824037 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.824058 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.824071 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.927351 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.927424 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.927436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.927537 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:17 crc kubenswrapper[4882]: I1002 16:18:17.927553 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:17Z","lastTransitionTime":"2025-10-02T16:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.030874 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.030943 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.030960 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.030986 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.031001 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.134555 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.134601 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.134616 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.134636 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.134650 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.237023 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.237080 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.237097 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.237120 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.237137 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.340463 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.340519 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.340534 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.340558 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.340572 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.442759 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.442804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.442816 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.442837 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.442849 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.546324 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.546395 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.546407 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.546427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.546454 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.642907 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.642996 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.643024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.643057 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.643082 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.664494 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:18Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.669886 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.669914 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.669923 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.669939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.669948 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.682288 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:18Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.686898 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.686929 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.686939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.686953 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.686964 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.700550 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:18Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.705338 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.705374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.705383 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.705400 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.705412 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.717989 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:18Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.722594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.722654 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.722666 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.722683 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.722695 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.735341 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:18Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:18 crc kubenswrapper[4882]: E1002 16:18:18.735574 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.738148 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
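Every retry above fails identically: the kubelet's node-status PATCH is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, well before the node's current time of 2025-10-02. A minimal diagnostic sketch of that check, assuming Python 3 and the third-party cryptography package are available on the node (the script is illustrative and not part of the log):

    # Illustrative only; assumes the third-party "cryptography" package.
    # Fetches the webhook's serving certificate and prints its validity
    # window, mirroring the x509 "certificate has expired" failure above.
    import ssl
    from datetime import datetime
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log

    # get_server_certificate() does not verify the peer, so it can still
    # retrieve an expired certificate.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    print("notBefore:", cert.not_valid_before)  # naive UTC datetime
    print("notAfter: ", cert.not_valid_after)
    print("expired:  ", datetime.utcnow() > cert.not_valid_after)

If notAfter is in the past, the webhook rejects every status patch, which is exactly the retry loop recorded above.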
event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.738351 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.738436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.738532 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.738633 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.842281 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.842614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.842961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.843145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.843284 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.946814 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.947505 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.947558 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.947619 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:18 crc kubenswrapper[4882]: I1002 16:18:18.947639 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:18Z","lastTransitionTime":"2025-10-02T16:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.050958 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.051005 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.051015 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.051033 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.051044 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.153998 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.154064 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.154081 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.154101 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.154117 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.256860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.256928 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.256945 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.256970 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.256983 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.359636 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.359695 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.359705 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.359724 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.359734 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.463706 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.463775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.463788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.463817 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.463835 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.566506 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.566572 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.566587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.566607 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.566619 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
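The Ready=False condition repeated above has a single stated cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. A companion sketch under the same assumptions (Python 3 on the node; the path is taken verbatim from the kubelet message) that shows whether the network plugin has written its config yet:

    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d/"  # path from the kubelet message

    try:
        entries = sorted(os.listdir(CNI_DIR))
    except FileNotFoundError:
        entries = []

    if entries:
        for name in entries:
            print(name)  # e.g. the OVN-Kubernetes CNI config, once written
    else:
        # Matches the logged state: the kubelet stays NotReady until the
        # network plugin drops a config file here.
        print(CNI_DIR + " is missing or empty")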
Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.669943 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.670011 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.670029 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.670059 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.670077 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.759464 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.759580 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:19 crc kubenswrapper[4882]: E1002 16:18:19.759651 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.759715 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.759757 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:19 crc kubenswrapper[4882]: E1002 16:18:19.760008 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:19 crc kubenswrapper[4882]: E1002 16:18:19.760281 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:19 crc kubenswrapper[4882]: E1002 16:18:19.760374 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.773403 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.773479 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.773490 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.773508 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.773520 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.876524 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.876931 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.877037 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.877145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.877267 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.980272 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.980341 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.980357 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.980383 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:19 crc kubenswrapper[4882]: I1002 16:18:19.980399 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:19Z","lastTransitionTime":"2025-10-02T16:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.082845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.082913 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.082930 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.082955 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.082971 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.186294 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.186339 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.186352 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.186377 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.186396 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.289954 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.290008 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.290021 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.290042 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.290055 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.393017 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.393065 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.393077 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.393097 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.393110 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.496460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.497031 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.497137 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.497254 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.497358 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.600380 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.600413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.600423 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.600438 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.600448 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.702728 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.702768 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.702777 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.702795 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.702806 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
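The bursts above are one pattern repeating on a roughly 100 ms cycle: four "Recording event message" entries from kubelet_node_status.go followed by a "Node became not ready" dump from setters.go, always with the same KubeletNotReady reason and the same missing-CNI message. The payload after `condition=` is plain JSON, so the transitions can be pulled out of a saved copy of this journal with a short script. A minimal sketch in Python, assuming the journal was exported one entry per line to a text file (the file name `kubelet.log` and the helper are hypothetical, not anything kubelet provides):

```python
import json
import re

# Matches the JSON object kubelet's setters.go prints after "condition=".
# Assumes each journal entry sits on a single line, as journalctl emits it.
COND_RE = re.compile(r'"Node became not ready" node="(?P<node>[^"]+)" condition=(?P<json>\{.*?\})')

def ready_transitions(path="kubelet.log"):
    """Yield (node, reason, message) for every NotReady condition dump."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = COND_RE.search(line)
            if not m:
                continue
            cond = json.loads(m.group("json"))
            yield m.group("node"), cond["reason"], cond["message"]

if __name__ == "__main__":
    for node, reason, message in ready_transitions():
        # For the entries above this always prints the same missing-CNI text.
        print(f"{node}: {reason}: {message}")
```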
Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.760501 4882 scope.go:117] "RemoveContainer" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c" Oct 02 16:18:20 crc kubenswrapper[4882]: E1002 16:18:20.760692 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.805494 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.805545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.805558 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.805578 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.805592 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.908096 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.908136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.908145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.908187 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:20 crc kubenswrapper[4882]: I1002 16:18:20.908200 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:20Z","lastTransitionTime":"2025-10-02T16:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
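Mixed into the status churn, the RemoveContainer and CrashLoopBackOff entries just above show the actual blocker: ovnkube-controller keeps failing, and kubelet is holding it in a 20 s back-off. Kubelet's crash-loop delay, by default, starts at 10 s and doubles on each consecutive failure up to a 5 minute cap, so "back-off 20s" corresponds to a second consecutive failure. A small sketch of that schedule; the constants are kubelet defaults as I understand them, not values read from this node:

```python
BASE_S = 10   # initial crash-loop delay (kubelet default, assumed)
CAP_S = 300   # maximum delay of 5 minutes (kubelet default, assumed)

def backoff_after(failures: int) -> int:
    """Delay applied after the Nth consecutive failure of a container."""
    if failures <= 0:
        return 0
    return min(BASE_S * 2 ** (failures - 1), CAP_S)

# The log shows "back-off 20s": consistent with two consecutive failures.
assert backoff_after(2) == 20
print([backoff_after(n) for n in range(1, 8)])  # [10, 20, 40, 80, 160, 300, 300]
```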
Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.011574 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.012251 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.012269 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.012318 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.012358 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.115901 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.116344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.116523 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.116693 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.116851 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.219899 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.219956 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.219969 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.219991 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.220003 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.323025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.323416 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.323491 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.323562 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.323623 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.426062 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.426110 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.426119 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.426138 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.426153 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.529419 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.529756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.529828 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.529904 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.529964 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.632477 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.632545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.632559 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.632580 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.632591 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.734949 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.735305 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.735396 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.735474 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.735561 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.759288 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.759288 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:21 crc kubenswrapper[4882]: E1002 16:18:21.759522 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.759599 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:21 crc kubenswrapper[4882]: E1002 16:18:21.759680 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:21 crc kubenswrapper[4882]: E1002 16:18:21.759754 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.759892 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:21 crc kubenswrapper[4882]: E1002 16:18:21.760060 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.838241 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.838291 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.838303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.838321 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.838335 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
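Each "No sandbox for pod can be found" / "Error syncing pod" pair above fails for the same reason the node is NotReady: /etc/kubernetes/cni/net.d/ holds no CNI configuration, which is expected while ovnkube-controller, the component that writes the OVN-Kubernetes config, sits in back-off. A hedged sketch for inspecting that directory once the plugin recovers; the path comes verbatim from the error, while the accepted extensions and top-level keys follow the general CNI conflist convention rather than anything shown in this log:

```python
import json
import os

CNI_DIR = "/etc/kubernetes/cni/net.d"  # path taken verbatim from the kubelet error

def describe_cni_dir(path=CNI_DIR):
    """Print each CNI config found, or explain the kubelet error if empty."""
    entries = sorted(os.listdir(path)) if os.path.isdir(path) else []
    if not entries:
        print(f"{path} is empty or missing -> kubelet stays NotReady, "
              "matching the 'no CNI configuration file' entries above")
        return
    for name in entries:
        if not name.endswith((".conf", ".conflist", ".json")):
            continue  # libcni only considers these extensions (assumed)
        with open(os.path.join(path, name), encoding="utf-8") as fh:
            cfg = json.load(fh)
        # A .conflist has a "plugins" array; a bare .conf is itself one plugin.
        plugins = [p.get("type") for p in cfg.get("plugins", [cfg])]
        print(f"{name}: name={cfg.get('name')} cniVersion={cfg.get('cniVersion')} "
              f"plugins={plugins}")

if __name__ == "__main__":
    describe_cni_dir()
```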
Has your network provider started?"} Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.940785 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.940826 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.940837 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.940864 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:21 crc kubenswrapper[4882]: I1002 16:18:21.940876 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:21Z","lastTransitionTime":"2025-10-02T16:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.043482 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.043523 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.043532 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.043549 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.043560 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.146805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.146841 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.146853 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.146868 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.146879 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.250837 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.250938 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.250965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.251001 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.251037 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.354647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.354683 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.354693 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.354709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.354721 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.457241 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.457289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.457316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.457332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.457343 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.560259 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.560313 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.560326 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.560347 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.560359 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.663745 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.664174 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.664414 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.664476 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.664495 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.767153 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.767607 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.767636 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.767657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.767673 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.777626 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.792457 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.805577 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.816141 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.827118 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.840750 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-api
server-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.853765 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.868187 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.869914 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.869965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.869981 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.870008 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.870021 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.883122 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.900556 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.915444 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.931147 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.947486 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.959305 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.973049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.973100 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.973112 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.973130 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.973142 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:22Z","lastTransitionTime":"2025-10-02T16:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:22 crc kubenswrapper[4882]: I1002 16:18:22.979766 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:
17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.001339 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa
6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:22Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.013886 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:23Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.075794 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.075847 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.075862 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.075884 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.075928 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.178860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.178910 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.178920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.178942 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.178952 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.282506 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.282575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.282592 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.282620 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.282640 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.387571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.387715 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.387740 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.387770 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.387794 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.492238 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.492291 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.492303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.492322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.492333 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.597054 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.597167 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.597196 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.597263 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.597291 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.700717 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.701184 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.701323 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.701385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.701414 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.759381 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.759657 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.759512 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.759480 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:23 crc kubenswrapper[4882]: E1002 16:18:23.759828 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:23 crc kubenswrapper[4882]: E1002 16:18:23.759946 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:23 crc kubenswrapper[4882]: E1002 16:18:23.760089 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:23 crc kubenswrapper[4882]: E1002 16:18:23.760285 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.805631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.805722 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.805748 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.805773 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.805792 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.908672 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.908721 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.908736 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.908756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:23 crc kubenswrapper[4882]: I1002 16:18:23.908770 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:23Z","lastTransitionTime":"2025-10-02T16:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.012039 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.012101 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.012118 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.012145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.012164 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.115051 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.115135 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.115150 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.115186 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.115199 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.218170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.218322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.218355 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.218402 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.218426 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.321608 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.321661 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.321673 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.321691 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.321701 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.425617 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.425701 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.425719 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.425751 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.425775 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.529031 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.529078 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.529090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.529117 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.529130 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.633328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.633412 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.633432 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.633461 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.633481 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.737335 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.737398 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.737415 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.737437 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.737451 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.841063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.841121 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.841139 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.841165 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.841182 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.944329 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.944375 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.944415 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.944434 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:24 crc kubenswrapper[4882]: I1002 16:18:24.944444 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:24Z","lastTransitionTime":"2025-10-02T16:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.047074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.047583 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.047732 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.047867 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.047993 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.151239 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.151289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.151318 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.151338 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.151354 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.254863 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.254963 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.254984 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.255012 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.255031 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.359024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.359086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.359101 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.359125 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.359141 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.471362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.471424 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.471436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.471456 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.471471 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.574844 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.574896 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.574909 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.574931 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.574944 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.677936 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.677976 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.677989 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.678007 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.678021 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.759263 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:25 crc kubenswrapper[4882]: E1002 16:18:25.759449 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.759906 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:25 crc kubenswrapper[4882]: E1002 16:18:25.759960 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.759995 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:25 crc kubenswrapper[4882]: E1002 16:18:25.760039 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.760073 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:25 crc kubenswrapper[4882]: E1002 16:18:25.760114 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.781227 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.781258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.781267 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.781286 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.781296 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.884182 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.884252 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.884261 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.884304 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.884342 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.987038 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.987341 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.987466 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.987595 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:25 crc kubenswrapper[4882]: I1002 16:18:25.987743 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:25Z","lastTransitionTime":"2025-10-02T16:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.090197 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.090253 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.090287 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.090305 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.090319 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.193971 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.194058 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.194090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.194122 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.194145 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.297109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.297180 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.297199 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.297249 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.297269 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.401178 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.402606 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.402685 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.402735 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.402778 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.505638 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.505691 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.505700 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.505719 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.505733 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.608509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.608560 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.608571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.608590 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.608603 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.712535 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.713483 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.713598 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.713700 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.713798 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.817209 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.817282 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.817295 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.817316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.817329 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.920404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.920451 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.920462 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.920483 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:26 crc kubenswrapper[4882]: I1002 16:18:26.920495 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:26Z","lastTransitionTime":"2025-10-02T16:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.023478 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.023537 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.023551 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.023571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.023584 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.127011 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.127076 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.127090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.127112 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.127124 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.230098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.230143 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.230151 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.230170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.230180 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.332589 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.332656 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.332669 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.332687 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.332700 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.436596 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.436647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.436660 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.436680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.436692 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.540006 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.540074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.540086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.540108 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.540124 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.643292 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.643395 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.643414 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.643446 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.643471 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.746331 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.746391 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.746405 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.746428 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.746444 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.759659 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:27 crc kubenswrapper[4882]: E1002 16:18:27.759822 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.759817 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.759883 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.759917 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:27 crc kubenswrapper[4882]: E1002 16:18:27.759976 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:27 crc kubenswrapper[4882]: E1002 16:18:27.760159 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:27 crc kubenswrapper[4882]: E1002 16:18:27.760314 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.849241 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.849294 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.849306 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.849326 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.849343 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.952528 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.952576 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.952587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.952604 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:27 crc kubenswrapper[4882]: I1002 16:18:27.952618 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:27Z","lastTransitionTime":"2025-10-02T16:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.055732 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.055777 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.055789 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.055808 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.055822 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.159579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.159636 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.159652 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.159675 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.159690 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.263383 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.263441 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.263451 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.263471 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.263484 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.367049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.367123 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.367143 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.367170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.367188 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.469718 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.469784 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.469801 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.469824 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.469838 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.572825 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.572890 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.572900 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.572918 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.572929 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.676259 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.676297 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.676306 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.676321 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.676332 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.778864 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.779292 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.779400 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.779493 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.779573 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.882998 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.883057 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.883067 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.883087 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.883097 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.969950 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.970599 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.970851 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.971065 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.971330 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:28 crc kubenswrapper[4882]: E1002 16:18:28.991781 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:28Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.998200 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.998264 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.998277 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.998296 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:28 crc kubenswrapper[4882]: I1002 16:18:28.998308 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:28Z","lastTransitionTime":"2025-10-02T16:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.014315 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.019955 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.020182 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.020374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.020520 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.020653 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.041650 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.047382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.047414 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.047426 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.047445 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.047459 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.067656 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.073166 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.073203 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.073240 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.073258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.073273 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.089198 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:29Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.089350 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.091731 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.091796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.091812 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.091839 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.091854 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.194795 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.194845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.194856 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.194873 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.194885 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.298056 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.298098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.298111 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.298131 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.298146 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.401896 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.401958 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.401977 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.402004 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.402021 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.505160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.505207 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.505238 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.505258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.505269 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.608263 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.608322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.608338 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.608361 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.608390 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.711570 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.711643 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.711667 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.711700 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.711720 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.761473 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.761525 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.761578 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.761534 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.761658 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.761790 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.761853 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:29 crc kubenswrapper[4882]: E1002 16:18:29.761957 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.815597 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.815668 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.815678 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.815696 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.815709 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.919331 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.919363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.919375 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.919393 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:29 crc kubenswrapper[4882]: I1002 16:18:29.919442 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:29Z","lastTransitionTime":"2025-10-02T16:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.022594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.022652 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.022670 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.022695 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.022710 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.125641 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.125685 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.125696 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.125717 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.125732 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.228881 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.228949 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.228962 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.228985 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.228998 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.275179 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:30 crc kubenswrapper[4882]: E1002 16:18:30.275485 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:30 crc kubenswrapper[4882]: E1002 16:18:30.275634 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:19:02.27559893 +0000 UTC m=+101.024828487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs") pod "network-metrics-daemon-6ldvk" (UID: "9f988cab-7579-4a12-8df6-e3e91e42f7df") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.333781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.333854 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.333873 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.333905 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.333924 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.437922 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.437979 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.437992 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.438012 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.438024 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.541506 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.541575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.541590 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.541614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.541630 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.644705 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.644762 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.644771 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.644793 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.644809 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.747629 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.747674 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.747684 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.747704 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.747716 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.850794 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.850832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.850841 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.850857 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.850868 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.954462 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.954546 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.954566 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.954594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:30 crc kubenswrapper[4882]: I1002 16:18:30.954613 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:30Z","lastTransitionTime":"2025-10-02T16:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.058132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.058203 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.058240 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.058263 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.058278 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.160612 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.160657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.160666 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.160685 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.160696 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.263967 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.264028 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.264044 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.264066 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.264082 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.367631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.367684 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.367697 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.367716 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.367727 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.470662 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.470714 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.470725 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.470744 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.470755 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.573600 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.573655 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.573671 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.573741 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.573770 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.677139 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.677209 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.677404 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.677515 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.677536 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.759483 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:31 crc kubenswrapper[4882]: E1002 16:18:31.760123 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.759937 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:31 crc kubenswrapper[4882]: E1002 16:18:31.760702 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.759890 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:31 crc kubenswrapper[4882]: E1002 16:18:31.760927 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.759974 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:31 crc kubenswrapper[4882]: E1002 16:18:31.761166 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.781549 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.781774 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.781917 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.782041 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.782142 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.885187 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.885287 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.885300 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.885325 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.885341 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.988573 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.988631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.988644 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.988671 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:31 crc kubenswrapper[4882]: I1002 16:18:31.988688 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:31Z","lastTransitionTime":"2025-10-02T16:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.091628 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.091680 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.091692 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.091710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.091724 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.194620 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.194689 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.194702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.194726 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.194740 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.219274 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/0.log"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.219347 4882 generic.go:334] "Generic (PLEG): container finished" podID="565a5a5f-e220-4ce6-86a7-f94f9dbe48c2" containerID="e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f" exitCode=1
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.219402 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerDied","Data":"e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f"}
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.219955 4882 scope.go:117] "RemoveContainer" containerID="e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.254114 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.274783 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.291764 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.297524 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.297924 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.297940 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.297961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.297974 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.307049 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.322031 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.337457 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.353241 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.368156 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.382110 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.398679 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.402685 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.402739 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.402755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.402781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.402797 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.412534 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63
a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.424417 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"p
odIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.441698 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256
:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.461891 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.479117 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.497834 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.505443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.505483 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.505493 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.505510 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.505523 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.512473 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.608167 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.608205 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.608228 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.608248 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.608264 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.711515 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.711757 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.711868 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.712008 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.712112 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.781687 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.797483 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.812346 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.814793 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.814846 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.814859 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.814878 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.814892 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.827255 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.842035 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.854427 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.871457 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.891613 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.905004 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917752 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917867 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917908 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917926 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.917940 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:32Z","lastTransitionTime":"2025-10-02T16:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.934975 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.960658 4882 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.975925 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:32 crc kubenswrapper[4882]: I1002 16:18:32.995182 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:32Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.008858 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.021555 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.022005 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.022065 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.022079 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.022126 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.022144 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.033052 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.124807 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.124863 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.124879 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.124901 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.124915 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.225104 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/0.log" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.225233 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerStarted","Data":"ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.226979 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.227178 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.227358 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.227527 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.227666 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.244072 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.258858 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.271205 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.284682 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.304987 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.320608 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.330783 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.330820 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.330833 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.330857 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.330873 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.341411 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.358150 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.376187 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.391265 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.409936 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.431940 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.433646 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.433696 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.433709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.433728 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.433740 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.448576 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.465598 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.481508 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.497356 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.512548 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:33Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.536813 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.536870 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.536885 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.536919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.536940 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.639093 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.639136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.639146 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.639160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.639169 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.741903 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.741961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.741975 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.741996 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.742012 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.760175 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.760241 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.760248 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.760195 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:33 crc kubenswrapper[4882]: E1002 16:18:33.760345 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:33 crc kubenswrapper[4882]: E1002 16:18:33.760412 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:33 crc kubenswrapper[4882]: E1002 16:18:33.760486 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:33 crc kubenswrapper[4882]: E1002 16:18:33.760579 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.845069 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.845123 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.845138 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.845156 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.845170 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.948484 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.948545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.948564 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.948587 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:33 crc kubenswrapper[4882]: I1002 16:18:33.948602 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:33Z","lastTransitionTime":"2025-10-02T16:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.051719 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.051786 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.051797 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.051816 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.051828 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:34Z","lastTransitionTime":"2025-10-02T16:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.155026 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.155139 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.155175 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.155207 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.155238 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:34Z","lastTransitionTime":"2025-10-02T16:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.257506 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.257555 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.257568 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.257586 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.257599 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:34Z","lastTransitionTime":"2025-10-02T16:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.361124 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.361194 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.361249 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.361278 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:34 crc kubenswrapper[4882]: I1002 16:18:34.361296 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:34Z","lastTransitionTime":"2025-10-02T16:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[Identical five-entry cycles (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") recur at roughly 100 ms intervals; the repeats from 16:18:34.466 through 16:18:36.220 are omitted here.]
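The cycle above is the whole story of the node's NotReady state: the container runtime's network plugin keeps reporting NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration file, and the kubelet re-records the same Node conditions on every status sync until one appears. Below is a minimal sketch of that presence check, assuming only the directory named in the log; the extension list mirrors what CNI config loaders conventionally accept and is an assumption, not something the log confirms.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the log message; the kubelet stays NotReady
	// while the network plugin finds no configuration file here.
	dir := "/etc/kubernetes/cni/net.d"
	var confs []string
	// Assumed conventional CNI config extensions.
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(dir, pat))
		confs = append(confs, matches...)
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration found; NetworkReady would stay false")
		os.Exit(1)
	}
	fmt.Println("CNI configs present:", confs)
}

On this cluster the configuration is written by ovnkube-controller (its pod mounts /etc/cni/net.d as host-cni-netd, as the status patch further below shows), so the condition cannot clear while that container is crash-looping.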
Oct 02 16:18:35 crc kubenswrapper[4882]: I1002 16:18:35.759369 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:35 crc kubenswrapper[4882]: I1002 16:18:35.759480 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:35 crc kubenswrapper[4882]: E1002 16:18:35.759581 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:35 crc kubenswrapper[4882]: I1002 16:18:35.759640 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:35 crc kubenswrapper[4882]: I1002 16:18:35.759648 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:35 crc kubenswrapper[4882]: E1002 16:18:35.759798 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:35 crc kubenswrapper[4882]: E1002 16:18:35.759954 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:35 crc kubenswrapper[4882]: E1002 16:18:35.760079 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:35 crc kubenswrapper[4882]: I1002 16:18:35.760920 4882 scope.go:117] "RemoveContainer" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c"
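Interleaved with these sandbox retries, every "Failed to update status for pod" entry that follows fails for one root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, weeks before the node's clock time of 2025-10-02. A certificate's validity window can be confirmed directly off the socket; here is a minimal sketch, assuming only the endpoint quoted in the log.

package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// Webhook endpoint taken from the log's failed Post URL.
	addr := "127.0.0.1:9743"
	// Skip chain verification so an already-expired certificate can
	// still be retrieved and inspected.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Fprintln(os.Stderr, "dial:", err)
		os.Exit(1)
	}
	defer conn.Close()
	now := time.Now()
	// Print each peer certificate's expiry against the current time.
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notAfter=%s expired=%v\n",
			cert.Subject.String(), cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}

Until that certificate is rotated, the kubelet's status_manager keeps logging the patches it could not apply, which is why the rest of this stretch of the journal is dominated by the escaped JSON bodies of those failed patches.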
Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.242552 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/2.log" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.248132 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.248978 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.278192 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951
dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.293593 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.313268 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.323482 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.323527 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.323538 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.323554 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.323568 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.333498 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.351001 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.363407 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.378295 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.395459 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.410911 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.424726 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.426443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.426511 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.426531 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.426550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.426563 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.437420 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63
a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.451827 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"p
odIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.472020 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256
:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.486660 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.501501 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.515273 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.530086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.530136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.530149 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.530170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.530181 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.531612 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:36Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.633031 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.633081 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.633095 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.633114 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.633126 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.735820 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.735882 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.735893 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.735915 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.735927 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.839006 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.839070 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.839085 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.839101 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.839112 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.942534 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.942595 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.942605 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.942625 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:36 crc kubenswrapper[4882]: I1002 16:18:36.942639 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:36Z","lastTransitionTime":"2025-10-02T16:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.046671 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.046722 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.046735 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.046755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.046771 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.149476 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.149544 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.149559 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.149579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.149591 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.251721 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.251755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.251770 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.251788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.251799 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.255086 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/3.log" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.255869 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/2.log" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.258943 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" exitCode=1 Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.258992 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.259039 4882 scope.go:117] "RemoveContainer" containerID="31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.259864 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:18:37 crc kubenswrapper[4882]: E1002 16:18:37.260066 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.285963 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31a3603a826c5acfac68790d39a24af38bab5eaa6fb682ea3288621220f5da9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:08Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 16:18:08.654913 6557 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655204 6557 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 16:18:08.655784 6557 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 16:18:08.655833 6557 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 16:18:08.655844 6557 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 16:18:08.655858 6557 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 16:18:08.655863 6557 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 16:18:08.655909 6557 factory.go:656] Stopping watch factory\\\\nI1002 16:18:08.655926 6557 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 16:18:08.655932 6557 ovnkube.go:599] Stopped ovnkube\\\\nI1002 16:18:08.655947 6557 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 16:18:08.655958 6557 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:36Z\\\",\\\"message\\\":\\\"vices.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 16:18:36.754065 6904 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nF1002 16:18:36.754073 6904 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webho\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.300958 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.317267 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.336297 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.352848 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.354873 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.354932 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.354945 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.354965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.354976 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.367416 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.387238 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.403599 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.418749 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.435799 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.451096 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.457849 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.457922 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.457933 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.457954 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.457966 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.465362 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"
podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.480982 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha25
6:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.495148 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.508445 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.525807 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.541617 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:37Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.561409 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.561547 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.561575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.561608 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.561632 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.667134 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.667181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.667192 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.667236 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.667253 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.759507 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.759554 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.759553 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:37 crc kubenswrapper[4882]: E1002 16:18:37.759716 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:37 crc kubenswrapper[4882]: E1002 16:18:37.759878 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.759946 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:37 crc kubenswrapper[4882]: E1002 16:18:37.760063 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:37 crc kubenswrapper[4882]: E1002 16:18:37.760297 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.771310 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.771370 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.771389 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.771413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.771432 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.874841 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.874907 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.874921 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.874940 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.874952 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.977705 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.977746 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.977756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.977772 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:37 crc kubenswrapper[4882]: I1002 16:18:37.977782 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:37Z","lastTransitionTime":"2025-10-02T16:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.080707 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.080761 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.080779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.080804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.080821 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.184309 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.184676 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.184741 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.184804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.184862 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.266309 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/3.log" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.272058 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:18:38 crc kubenswrapper[4882]: E1002 16:18:38.272573 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.288262 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.288319 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.288332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.288353 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.288365 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.292519 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.306843 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.323049 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.341862 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.365381 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.382770 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.392280 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.392684 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.392882 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.393154 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.393392 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.404022 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.417247 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.439443 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.472693 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:36Z\\\",\\\"message\\\":\\\"vices.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 16:18:36.754065 6904 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nF1002 16:18:36.754073 6904 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webho\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.490852 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.496649 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.496922 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.497063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.497192 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.497355 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.517690 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.542699 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.562960 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.585542 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.600303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.600650 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.600784 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.600893 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.600983 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.605936 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.630728 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:38Z is after 2025-08-24T17:21:41Z" Oct 02 
16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.705305 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.705769 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.705916 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.706047 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.706187 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.809085 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.809146 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.809161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.809182 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.809201 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.912267 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.912509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.912520 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.912540 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:38 crc kubenswrapper[4882]: I1002 16:18:38.912555 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:38Z","lastTransitionTime":"2025-10-02T16:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.015523 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.015939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.016072 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.016181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.016291 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.119727 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.120131 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.120450 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.120669 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.120887 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.217194 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.217265 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.217277 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.217296 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.217308 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.234605 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.239920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.239978 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.239988 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.240010 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.240023 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.255958 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.260910 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.260950 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
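The status-patch retries above and below all fail the same way: the kubelet cannot call the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 because that webhook's serving certificate expired at 2025-08-24T17:21:41Z, while the node clock reads 2025-10-02. A minimal sketch of how one could confirm the certificate's validity window from the node itself, assuming Go is available there and the webhook is still listening; the file name and the approach are illustrative, not taken from this log:

    // checkcert.go — sketch: dial the failing webhook endpoint and print
    // the serving certificate's validity window.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Skip chain verification on purpose: the goal is to inspect a
        // certificate that no longer verifies.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743",
            &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("dial: %v", err)
        }
        defer conn.Close()

        // PeerCertificates[0] is the leaf certificate presented by the webhook.
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }

If the log's diagnosis is right, this would print notAfter 2025-08-24T17:21:41Z and expired true. Note that the NotReady condition in the surrounding entries has a second, independent cause: no CNI configuration exists yet in /etc/kubernetes/cni/net.d/.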
event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.260960 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.260983 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.261000 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.274629 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.279344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.279398 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.279413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.279444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.279462 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.295163 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.300339 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.300403 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.300419 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.300446 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.300460 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.314838 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... 50-entry image list elided; byte-for-byte identical to the image list in the preceding status-patch error ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.314986 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
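
The status patches above are rejected before they reach storage: the kubelet's POST to the node.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743 fails TLS verification because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-02. The wording "x509: certificate has expired or is not yet valid: current time ... is after ..." is Go's crypto/x509 validity-window error. Below is a minimal sketch of that same window check, assuming the serving certificate has been saved in PEM form to the hypothetical path /tmp/webhook-serving.pem:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Hypothetical path: a PEM dump of the webhook's serving certificate.
	data, err := os.ReadFile("/tmp/webhook-serving.pem")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block found in input")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	// The same NotBefore/NotAfter window test that fails the handshake above.
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is inside its validity window")
	}
}

Against this log's certificate the expired branch would fire, matching the "current time 2025-10-02T16:18:39Z is after 2025-08-24T17:21:41Z" text above.
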
Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.317308 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.317340 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.317349 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.317365 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.317376 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.420190 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.420256 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.420268 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.420286 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.420298 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.523303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.523366 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.523376 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.523393 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
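
Every "Node became not ready" record above serializes the node's Ready condition as one JSON object. A standard-library sketch of how a log post-processing script might decode and test such a record (the struct fields mirror the payload; the literal is copied verbatim from the entry above):

package main

import (
	"encoding/json"
	"fmt"
	"log"
	"time"
)

// nodeCondition mirrors the condition object printed by setters.go above.
type nodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// One condition payload copied from the log above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	if c.Type == "Ready" && c.Status != "True" {
		fmt.Printf("node not ready since %s: %s (%s)\n",
			c.LastTransitionTime.Format(time.RFC3339), c.Reason, c.Message)
	}
}

Runs like this one, where type is Ready and status is False, are exactly the transitions the kubelet keeps re-recording below while the CNI configuration is missing.
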
Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.523407 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.626135 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.626188 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.626199 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.626230 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.626247 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.729817 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.729849 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.729858 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.729895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.729906 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.759518 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.759531 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.759712 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.759566 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.759763 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.759924 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.760035 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:39 crc kubenswrapper[4882]: E1002 16:18:39.760094 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.833982 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.834029 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.834040 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.834060 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.834075 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.936280 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.936333 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.936344 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.936362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:39 crc kubenswrapper[4882]: I1002 16:18:39.936377 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:39Z","lastTransitionTime":"2025-10-02T16:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.039443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.039497 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.039512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.039531 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.039572 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.142694 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.142764 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.142773 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.142792 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.142807 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.245978 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.246029 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.246042 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.246061 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.246073 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.348720 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.349158 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.349177 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.349202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.349247 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.451825 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.451944 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.451963 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.452001 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.452035 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.555207 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.555304 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.555316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.555337 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.555352 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.658353 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.658447 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.658460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.658513 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.658527 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.761413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.761476 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.761490 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.761510 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.761523 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.865288 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.865354 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.865369 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.865392 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.865406 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.968696 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.968774 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.968788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.968807 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:40 crc kubenswrapper[4882]: I1002 16:18:40.968825 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:40Z","lastTransitionTime":"2025-10-02T16:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.072489 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.072568 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.072584 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.072606 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.072621 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.176458 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.176526 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.176542 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.176566 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.176581 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.279801 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.279859 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.279875 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.279898 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.279914 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.382933 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.382999 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.383012 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.383032 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.383046 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.486018 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.486085 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.486098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.486118 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.486133 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.589374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.589430 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.589444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.589467 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.589481 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.692460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.692533 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.692551 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.692577 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.692594 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.759855 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.759923 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.760046 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:41 crc kubenswrapper[4882]: E1002 16:18:41.760054 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:41 crc kubenswrapper[4882]: E1002 16:18:41.760303 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.760361 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:41 crc kubenswrapper[4882]: E1002 16:18:41.760483 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:41 crc kubenswrapper[4882]: E1002 16:18:41.760711 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
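
These pod sync failures all trace back to the same cause the Ready condition keeps reporting: nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/ yet, so the runtime network stays NetworkReady=false until the network provider starts. A rough sketch of that directory probe, assuming the conventional .conf/.conflist/.json extensions accepted by CNI config loaders:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfDir is the directory the kubelet log above reports as empty.
const cniConfDir = "/etc/kubernetes/cni/net.d"

func main() {
	entries, err := os.ReadDir(cniConfDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", cniConfDir, err)
		return
	}
	var confs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		// Extensions conventionally accepted by CNI config loaders.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// This is the state the kubelet keeps logging above.
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("CNI configurations:", confs)
}

Once the network provider writes its configuration here, and the expired webhook certificate is rotated so the status patches go through, the NetworkReady=false / KubeletNotReady loop in this log should stop.
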
Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.775010 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.795501 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.795543 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.795555 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.795571 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.795583 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.898410 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.898456 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.898466 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.898483 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:41 crc kubenswrapper[4882]: I1002 16:18:41.898495 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:41Z","lastTransitionTime":"2025-10-02T16:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.001849 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.001915 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.001934 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.001961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.001978 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.104904 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.104951 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.104962 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.104980 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.104994 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.208267 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.208364 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.208418 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.208451 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.208475 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.315462 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.315523 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.315543 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.315576 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.315593 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.418641 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.418682 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.418692 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.418709 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.418718 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.522797 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.522862 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.522880 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.522907 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.522926 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.626611 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.626679 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.626702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.626734 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.626755 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.730100 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.730148 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.730161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.730178 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.730189 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.783030 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.804992 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.825751 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.832296 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.832339 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.832350 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.832366 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.832377 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.840639 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.859670 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.893636 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:36Z\\\",\\\"message\\\":\\\"vices.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 16:18:36.754065 6904 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nF1002 16:18:36.754073 6904 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webho\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.907125 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.921099 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.935166 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.935243 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.935262 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.935285 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.935302 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:42Z","lastTransitionTime":"2025-10-02T16:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.937496 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z"
Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.950833 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.963992 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.974259 4882 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.985566 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebfa0dc2-86a2-48d7-b081-f2151de93b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6f309f493e6a6be7dfd71077f5a26a675984ac41209e28644ba55d6d5d6303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:42 crc kubenswrapper[4882]: I1002 16:18:42.999557 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:42Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.013179 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:43Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.027395 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:43Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.038381 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.038439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.038453 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.038473 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.038488 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.042670 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:43Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.056756 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:43Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.142308 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.142379 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.142399 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.142426 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.142448 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.246373 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.246419 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.246430 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.246449 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.246460 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.349731 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.349779 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.349788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.349803 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.349815 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.453993 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.454072 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.454102 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.454137 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.454157 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.557417 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.557517 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.557550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.557637 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.557669 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.660878 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.660969 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.660998 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.661030 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.661052 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.760015 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.760087 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.760033 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:43 crc kubenswrapper[4882]: E1002 16:18:43.760310 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:43 crc kubenswrapper[4882]: E1002 16:18:43.760197 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.760598 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:43 crc kubenswrapper[4882]: E1002 16:18:43.760703 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:43 crc kubenswrapper[4882]: E1002 16:18:43.760795 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.763632 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.763681 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.763692 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.763711 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.763725 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.866668 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.866775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.866804 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.866839 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.866867 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.970011 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.970085 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.970102 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.970132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:43 crc kubenswrapper[4882]: I1002 16:18:43.970152 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:43Z","lastTransitionTime":"2025-10-02T16:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.073339 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.073406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.073421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.073443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.073460 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:44Z","lastTransitionTime":"2025-10-02T16:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.176865 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.176924 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.176939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.176964 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.176980 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:44Z","lastTransitionTime":"2025-10-02T16:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.280025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.280063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.280071 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.280086 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:44 crc kubenswrapper[4882]: I1002 16:18:44.280099 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:44Z","lastTransitionTime":"2025-10-02T16:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[The five-entry status block above repeats roughly every 100 ms while the node stays NotReady; the repetitions from 16:18:44.383 through 16:18:49.383 are elided below, leaving only the distinct entries.]
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.569384 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.569486 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.569664 4882 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.569670 4882 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.569785 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.569761548 +0000 UTC m=+148.318991075 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.569852 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.56981964 +0000 UTC m=+148.319049207 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
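[Editor's note: the "not registered" errors above mean the kubelet's own object cache has not yet registered these API objects for the pod, which is common right after a kubelet restart; it is worth confirming the Secret and ConfigMap actually exist server-side. A minimal sketch, assuming the kubernetes Python client (pip install kubernetes) and a working kubeconfig; the namespace and object names are copied from the log:]

    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    config.load_kube_config()  # use config.load_incluster_config() when run inside a pod
    v1 = client.CoreV1Api()

    # Objects the kubelet reported as "not registered" (names taken from the log).
    checks = [
        ("Secret", "openshift-network-console", "networking-console-plugin-cert"),
        ("ConfigMap", "openshift-network-console", "networking-console-plugin"),
    ]
    for kind, ns, name in checks:
        try:
            if kind == "Secret":
                v1.read_namespaced_secret(name, ns)
            else:
                v1.read_namespaced_config_map(name, ns)
            print(f"{kind} {ns}/{name}: present in the API")
        except ApiException as exc:
            print(f"{kind} {ns}/{name}: API error {exc.status}")

[If both objects exist, the errors point at kubelet-side cache sync rather than missing objects, and they normally clear once the kubelet's watch-based manager catches up.]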
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.670136 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.670206 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670384 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670411 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670424 4882 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670474 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.670459198 +0000 UTC m=+148.419688715 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670384 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670725 4882 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670778 4882 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.670938 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.670890689 +0000 UTC m=+148.420120256 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
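[Editor's note: note the retry bookkeeping in the operations above, "No retries permitted until ... 16:19:49" with "durationBeforeRetry 1m4s". 64 s is consistent with a doubling backoff that started at 500 ms (0.5 s x 2^7 = 64 s). A small illustrative sketch of such a schedule; the initial delay and cap are assumptions, not values read from this log:]

    import itertools

    def backoff_schedule(initial=0.5, factor=2.0, cap=128.0):
        """Yield retry delays for a doubling backoff with a ceiling (illustrative)."""
        delay = initial
        while True:
            yield min(delay, cap)
            delay *= factor

    # First nine delays: the eighth retry already waits the observed 64 s.
    print(list(itertools.islice(backoff_schedule(), 9)))
    # -> [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 128.0]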
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.760046 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.760184 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.760380 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.760429 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.760550 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.760621 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.760544 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.760708 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:45 crc kubenswrapper[4882]: I1002 16:18:45.872602 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:18:45 crc kubenswrapper[4882]: E1002 16:18:45.872853 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.87281504 +0000 UTC m=+148.622044567 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
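[Editor's note: the TearDown failure above means the kubelet currently has no registered CSI driver named kubevirt.io.hostpath-provisioner. Kubelet discovers CSI drivers through registration sockets that driver registrars create under its plugin registration directory. A minimal sketch to see what is registered on the node; /var/lib/kubelet/plugins_registry is the common default and an assumption here:]

    import os

    REG_DIR = "/var/lib/kubelet/plugins_registry"  # default kubelet registration dir (assumed)
    DRIVER = "kubevirt.io.hostpath-provisioner"    # driver name from the log

    try:
        entries = sorted(os.listdir(REG_DIR))
    except FileNotFoundError:
        entries = []
    print(f"registration sockets: {entries or 'none'}")
    print(f"entries mentioning {DRIVER}: {[e for e in entries if DRIVER in e] or 'none'}")

[An empty result is expected while the driver pod has not started; the unmount is retried after the backoff, so this usually resolves itself once the provisioner comes up.]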
Oct 02 16:18:47 crc kubenswrapper[4882]: I1002 16:18:47.760140 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:47 crc kubenswrapper[4882]: I1002 16:18:47.760161 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:47 crc kubenswrapper[4882]: I1002 16:18:47.761118 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:47 crc kubenswrapper[4882]: I1002 16:18:47.761249 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:47 crc kubenswrapper[4882]: E1002 16:18:47.761895 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:47 crc kubenswrapper[4882]: E1002 16:18:47.762636 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:47 crc kubenswrapper[4882]: E1002 16:18:47.763057 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:47 crc kubenswrapper[4882]: E1002 16:18:47.763076 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Has your network provider started?"} Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.017569 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.017610 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.017622 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.017639 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.017651 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:48Z","lastTransitionTime":"2025-10-02T16:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.121394 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.121466 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.121484 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.121514 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.121533 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:48Z","lastTransitionTime":"2025-10-02T16:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.225462 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.225623 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.225648 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.225717 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.225739 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:48Z","lastTransitionTime":"2025-10-02T16:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.331282 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.331331 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.331345 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.331366 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:48 crc kubenswrapper[4882]: I1002 16:18:48.331379 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:48Z","lastTransitionTime":"2025-10-02T16:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-record cycle (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeated unchanged except for timestamps, roughly every 100 ms, from 16:18:48.440735 through 16:18:49.383652 ...]
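The kubelet keeps the node's Ready condition False here because the CRI runtime finds no CNI network configuration under /etc/kubernetes/cni/net.d/; on OpenShift that configuration is normally written once the cluster network provider starts. A minimal sketch of the kind of check behind this message, assuming only the Go standard library and the conventional .conf/.conflist/.json config extensions (illustrative only, not the kubelet's actual code path):

// cnicheck.go: approximate the readiness test behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/" by listing candidate config files.
// Illustrative sketch only; the kubelet/CRI-O implementation differs.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var found []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		// CNI loaders conventionally accept .conf, .conflist and .json files.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
		os.Exit(1)
	}
	fmt.Printf("CNI configuration present: %v\n", found)
}

If the directory is empty, as it evidently is on this node, pod networking cannot be configured, which is why the Ready condition above stays False until the network provider writes its configuration.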
Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.406877 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.413004 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.413107 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.413131 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.413168 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.413192 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.434921 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... the status patch was retried with a byte-identical payload and rejected with the same expired-certificate webhook error at 16:18:49.434921, 16:18:49.464237, and 16:18:49.490618, each retry interleaved with the same NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady / "Node became not ready" cycle ...]
Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.495813 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.495855 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.495865 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.495889 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.495901 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.515639 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:49Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.515817 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.518732 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.518806 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.518827 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.518860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.518879 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.622136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.622289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.622316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.622355 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.622378 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.729715 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.729780 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.729791 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.729810 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.729822 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.759667 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.759705 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.759771 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.759890 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.759970 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.760101 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.760305 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.761115 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.761813 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:18:49 crc kubenswrapper[4882]: E1002 16:18:49.762270 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.832713 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.832806 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.832832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.832869 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.832894 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.936275 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.936352 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.936377 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.936405 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:49 crc kubenswrapper[4882]: I1002 16:18:49.936427 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:49Z","lastTransitionTime":"2025-10-02T16:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.040246 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.040286 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.040295 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.040311 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.040322 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.143322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.143481 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.143509 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.143593 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.143632 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.246755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.246800 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.246813 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.246832 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.246847 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.350161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.350272 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.350298 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.350330 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.350370 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.453549 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.453655 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.453676 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.453708 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.453728 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.566204 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.566258 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.566270 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.566283 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.566293 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.669155 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.669230 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.669243 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.669261 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.669273 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.771682 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.772054 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.772132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.772242 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.772308 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.875890 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.875946 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.875958 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.875976 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.875992 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.978791 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.978854 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.978869 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.978896 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:50 crc kubenswrapper[4882]: I1002 16:18:50.978910 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:50Z","lastTransitionTime":"2025-10-02T16:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.082159 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.082207 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.082242 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.082263 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.082274 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.185117 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.185160 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.185168 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.185183 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.185194 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.288865 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.288910 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.288921 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.288942 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.288954 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.391604 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.391644 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.391652 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.391666 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.391677 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.495946 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.496545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.496574 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.496607 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.496628 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.599239 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.599282 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.599295 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.599314 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.599324 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.701346 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.701638 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.701708 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.701826 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.701916 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.760009 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.760006 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:51 crc kubenswrapper[4882]: E1002 16:18:51.760699 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.760337 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:51 crc kubenswrapper[4882]: E1002 16:18:51.760960 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:51 crc kubenswrapper[4882]: E1002 16:18:51.760749 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.760189 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:51 crc kubenswrapper[4882]: E1002 16:18:51.761307 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.805423 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.805488 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.805502 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.805524 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.805541 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.908111 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.908159 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.908171 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.908192 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:51 crc kubenswrapper[4882]: I1002 16:18:51.908205 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:51Z","lastTransitionTime":"2025-10-02T16:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.010809 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.010852 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.010863 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.010879 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.010891 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.113786 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.113849 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.113860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.113879 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.113892 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.216838 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.216990 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.217059 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.217114 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.217136 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.320298 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.320353 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.320373 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.320393 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.320406 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.423102 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.423202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.423248 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.423271 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.423289 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.526005 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.526090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.526108 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.526127 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.526142 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.629300 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.629348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.629362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.629382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.629394 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.732888 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.732939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.732956 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.732978 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.732991 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.773441 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.784818 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.800477 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebfa0dc2-86a2-48d7-b081-f2151de93b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6f309f493e6a6be7dfd71077f5a26a675984ac41209e28644ba55d6d5d6303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.816430 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.847971 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.848055 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.848074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.848098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.848118 4882 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.863430 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.891328 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.906951 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.924492 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.945181 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.951246 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.951594 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.951780 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.951897 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.951997 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:52Z","lastTransitionTime":"2025-10-02T16:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.958703 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.972011 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.983920 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:52 crc kubenswrapper[4882]: I1002 16:18:52.999483 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:52Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.020392 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:36Z\\\",\\\"message\\\":\\\"vices.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 16:18:36.754065 6904 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nF1002 16:18:36.754073 6904 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webho\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.035765 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.051564 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.054337 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.054449 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.054513 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.054601 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.054663 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.065534 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.079193 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:53Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.157843 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.157926 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.157943 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.157967 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.157980 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.264845 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.264889 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.264902 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.264926 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.264940 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.368505 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.368541 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.368551 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.368581 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.368591 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.471045 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.471107 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.471125 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.471144 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.471157 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.575603 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.575713 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.575747 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.575789 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.575818 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.678170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.678282 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.678298 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.678338 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.678350 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.759234 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.759320 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:18:53 crc kubenswrapper[4882]: E1002 16:18:53.759395 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.759233 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.759256 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:18:53 crc kubenswrapper[4882]: E1002 16:18:53.759450 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:18:53 crc kubenswrapper[4882]: E1002 16:18:53.759540 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:18:53 crc kubenswrapper[4882]: E1002 16:18:53.759652 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.780878 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.780919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.780932 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.780947 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.780960 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.883770 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.883807 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.883819 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.883837 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.883849 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.986306 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.986354 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.986366 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.986382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:53 crc kubenswrapper[4882]: I1002 16:18:53.986393 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:53Z","lastTransitionTime":"2025-10-02T16:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.089383 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.089545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.089559 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.089575 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.089586 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.192633 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.192681 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.192693 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.192710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.192724 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.296105 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.296187 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.296202 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.296255 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.296269 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.399092 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.399140 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.399152 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.399172 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.399188 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.504090 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.504145 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.504161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.504188 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.504205 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.608097 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.608152 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.608163 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.608185 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.608195 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.712116 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.712193 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.712234 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.712265 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.712284 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.816050 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.816098 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.816109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.816126 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.816138 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.919377 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.919448 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.919471 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.919503 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:54 crc kubenswrapper[4882]: I1002 16:18:54.919527 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:54Z","lastTransitionTime":"2025-10-02T16:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.022870 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.022952 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.022966 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.022987 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.023000 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.127029 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.127085 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.127099 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.127115 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.127125 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.229388 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.229436 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.229451 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.229468 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.229481 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.331702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.331746 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.331758 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.331773 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.331784 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.435137 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.435241 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.435257 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.435276 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.435299 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.537964 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.538023 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.538035 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.538054 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.538067 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.640855 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.640925 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.641138 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.641159 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.641173 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.744816 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.744869 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.744882 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.744898 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.744910 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.760253 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.760343 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.760409 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:55 crc kubenswrapper[4882]: E1002 16:18:55.760468 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.760621 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:55 crc kubenswrapper[4882]: E1002 16:18:55.760685 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:55 crc kubenswrapper[4882]: E1002 16:18:55.760746 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:55 crc kubenswrapper[4882]: E1002 16:18:55.760813 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.848303 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.848953 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.849081 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.849243 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.849350 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.951707 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.952011 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.952084 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.952149 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:55 crc kubenswrapper[4882]: I1002 16:18:55.952236 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:55Z","lastTransitionTime":"2025-10-02T16:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.055710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.055775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.055788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.055805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.055815 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.158512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.158553 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.158565 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.158581 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.158592 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.261591 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.261710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.261723 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.261738 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.261749 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.364597 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.364647 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.364657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.364678 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.364726 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.468374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.468428 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.468438 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.468456 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.468468 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.571089 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.571459 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.571544 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.571619 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.571676 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.675444 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.675788 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.675956 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.676223 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.676554 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.779821 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.779867 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.779880 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.779895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.779905 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.883002 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.883050 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.883067 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.883083 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.883096 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.985494 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.985545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.985553 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.985568 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:56 crc kubenswrapper[4882]: I1002 16:18:56.985578 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:56Z","lastTransitionTime":"2025-10-02T16:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.089114 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.089181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.089197 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.089242 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.089258 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.191807 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.191861 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.191876 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.191894 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.191906 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.294109 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.294416 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.294542 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.294613 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.294675 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.397547 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.397590 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.397602 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.397619 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.397634 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.500802 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.500856 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.500868 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.500885 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.500895 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.603341 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.603385 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.603397 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.603412 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.603423 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.705842 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.706363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.706463 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.706567 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.706659 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.759480 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.759555 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.759481 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:57 crc kubenswrapper[4882]: E1002 16:18:57.759705 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:57 crc kubenswrapper[4882]: E1002 16:18:57.759863 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:57 crc kubenswrapper[4882]: E1002 16:18:57.759913 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.759945 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:57 crc kubenswrapper[4882]: E1002 16:18:57.759998 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.809893 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.809938 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.809947 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.809965 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.809976 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.912895 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.912939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.912950 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.912966 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:57 crc kubenswrapper[4882]: I1002 16:18:57.912977 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:57Z","lastTransitionTime":"2025-10-02T16:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.015421 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.015465 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.015473 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.015488 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.015499 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.125299 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.125366 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.125382 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.125400 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.125413 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.228399 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.228439 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.228450 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.228464 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.228476 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.388290 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.388338 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.388348 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.388362 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.388373 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.491756 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.491805 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.491815 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.491830 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.491842 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.594164 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.594221 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.594231 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.594246 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.594257 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.696951 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.696982 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.696991 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.697003 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.697012 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.799838 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.799884 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.799894 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.799907 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.799919 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.902744 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.902787 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.902798 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.902810 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:58 crc kubenswrapper[4882]: I1002 16:18:58.902820 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:58Z","lastTransitionTime":"2025-10-02T16:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.006064 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.006281 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.006312 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.006349 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.006377 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.109695 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.109790 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.109819 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.109863 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.109890 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.213771 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.213842 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.213860 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.213894 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.213910 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.316922 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.316967 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.316977 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.316992 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.317003 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.419512 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.419566 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.419579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.419596 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.419609 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.522322 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.522399 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.522412 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.522428 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.522441 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.625004 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.625056 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.625067 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.625083 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.625094 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.727778 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.727821 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.727831 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.727850 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.727860 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
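The same five messages recur roughly every 100 ms throughout this window. A quick way to quantify that in an excerpt like this is to tally the event="..." field per line; the following stdlib sketch is an analysis aid for the log text itself, assuming one journal entry per line on stdin, and is not part of any kubelet tooling.

// tally.go — counts occurrences of each node event in a kubenswrapper excerpt.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// eventRe extracts the event="..." field used by kubelet_node_status.go:724.
var eventRe = regexp.MustCompile(`event="([^"]+)"`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // condition entries can be long
	for sc.Scan() {
		if m := eventRe.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for ev, n := range counts {
		fmt.Printf("%-28s %d\n", ev, n)
	}
}

Piping this section through the tool would show NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, and NodeNotReady repeating in lockstep, confirming that only the Ready condition is actually failing.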
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.759274 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.759355 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.759384 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.759437 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.759302 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.759600 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.759636 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.759695 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.830727 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.830777 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.830790 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.830806 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.830819 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.863569 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.863625 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.863637 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.863657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.863667 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.881242 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.885561 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.885630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.885641 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.885657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.885683 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.903602 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.908162 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.908280 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.908309 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.908352 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.908383 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.926281 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.931364 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.931415 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.931426 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.931440 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.931452 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.951452 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.957389 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.957469 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.957496 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.957525 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.957546 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.972560 4882 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"06757550-4393-42fc-bde7-149710ea74c8\\\",\\\"systemUUID\\\":\\\"70d9d1e8-68ca-4a91-a8cc-9deeaf3b26ed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:18:59Z is after 2025-08-24T17:21:41Z" Oct 02 16:18:59 crc kubenswrapper[4882]: E1002 16:18:59.972808 4882 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.975133 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
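Every status-patch attempt above fails for the same reason: the serving certificate of the network-node-identity webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, about 39 days before these 2025-10-02 timestamps, so the API server rejects each patch. A minimal sketch (not part of the log) for confirming the validity window from the node itself; the host and port come from the webhook URL in the records above, and the third-party cryptography package (version 42+) is an assumed dependency:

    import socket
    import ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log records

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False       # inspect only, do not authenticate
    ctx.verify_mode = ssl.CERT_NONE  # verification would fail on an expired cert

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    # Per the log, notAfter should print 2025-08-24 17:21:41+00:00, which the
    # kubelet's clock (2025-10-02T16:18:59Z) is well past.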
event="NodeHasSufficientMemory" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.975244 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.975273 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.975299 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:18:59 crc kubenswrapper[4882]: I1002 16:18:59.975319 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:18:59Z","lastTransitionTime":"2025-10-02T16:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.079323 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.079390 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.079403 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.079420 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.079430 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.183463 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.183520 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.183544 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.183579 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.183601 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.286392 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.286434 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.286443 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.286460 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.286473 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.391156 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.391284 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.391308 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.391341 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.391363 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.494521 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.494572 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.494585 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.494604 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.494618 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.598024 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.598063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.598075 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.598089 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.598098 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.701496 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.701563 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.701572 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.701588 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.701597 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.804731 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.804781 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.804796 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.804815 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.804828 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.907669 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.907720 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.907736 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.907761 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:00 crc kubenswrapper[4882]: I1002 16:19:00.907779 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:00Z","lastTransitionTime":"2025-10-02T16:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.011063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.011128 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.011189 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.011204 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.011252 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.114568 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.114651 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.114671 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.114702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.114744 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.218516 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.218601 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.218625 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.218657 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.218675 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.321760 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.321808 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.321819 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.321834 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.321845 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.424111 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.424161 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.424170 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.424189 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.424203 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.527162 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.527247 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.527261 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.527285 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.527298 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.629775 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.629820 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.629830 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.629844 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.629854 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.732787 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.732838 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.732847 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.732866 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.732880 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.759663 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.759805 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:01 crc kubenswrapper[4882]: E1002 16:19:01.759833 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.759877 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.759903 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:01 crc kubenswrapper[4882]: E1002 16:19:01.760127 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:01 crc kubenswrapper[4882]: E1002 16:19:01.760647 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:01 crc kubenswrapper[4882]: E1002 16:19:01.760802 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.760990 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:19:01 crc kubenswrapper[4882]: E1002 16:19:01.761147 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.836074 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.836123 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.836132 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.836149 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.836158 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.938952 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.939038 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.939049 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.939063 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:01 crc kubenswrapper[4882]: I1002 16:19:01.939071 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:01Z","lastTransitionTime":"2025-10-02T16:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.041550 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.041630 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.041643 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.041661 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.041672 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.144793 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.144848 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.144859 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.144877 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.144892 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.247989 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.248036 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.248045 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.248064 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.248075 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.351068 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.351106 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.351116 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.351131 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.351141 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.356661 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:02 crc kubenswrapper[4882]: E1002 16:19:02.356823 4882 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 16:19:02 crc kubenswrapper[4882]: E1002 16:19:02.356881 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs podName:9f988cab-7579-4a12-8df6-e3e91e42f7df nodeName:}" failed. No retries permitted until 2025-10-02 16:20:06.35686474 +0000 UTC m=+165.106094267 (durationBeforeRetry 1m4s). 
[... three more identical five-record status groups at 16:19:02.454, 16:19:02.557 and 16:19:02.660 ...]
Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.763371 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.763402 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.763413 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.763427 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.763437 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.779153 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd74899-256b-4b2c-bcd7-51fb1d08991b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24896651ece6bceae62adc93e947c1e69165c1036ee57311d6ea74f914ac57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8q9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jxblv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.794042 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pm5z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cb59ee5-09c2-4d31-b1aa-1d2a57035275\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c41b3cd11f9265fb0f9ddc46a3871e972ff842add8c9a1605e30dbf6ed8e04e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wftw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pm5z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.805129 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebfa0dc2-86a2-48d7-b081-f2151de93b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e6f309f493e6a6be7dfd71077f5a26a675984ac41209e28644ba55d6d5d6303\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25540079e35db5c08d3dcc3a7315fb05f70f9cfa0d6c3e5dedd94f29d7f074bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.819047 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"121ff80b-31ca-4be3-907c-d493ba3532ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01254a8123333e98487e1f30bd4d8665ae7d0d8538fd8e53078ad7d0e3d9300d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd3276b78a0ffd6417958cb28fe77b6442f36fe522fbdb624f39b8257e956b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7eb776f4ecaef58c08559b307a360ce9f4fc2cee0249d9b6862a757467ab9446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65782d2c54d2a49539dd8657108bee4b6c2ee43cbcdeee964e4a04124354f560\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee07dc150bc07ac8cba3fdb803f354ce76cd61b9f546986e244df614abb5cfa9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T16:17:42Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 16:17:26.073146 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 16:17:26.076077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-394164313/tls.crt::/tmp/serving-cert-394164313/tls.key\\\\\\\"\\\\nI1002 16:17:41.404832 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 16:17:41.421310 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 16:17:41.421341 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 16:17:41.421376 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 16:17:41.421384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 16:17:41.449157 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1002 16:17:41.449178 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 16:17:41.449198 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 16:17:41.449209 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 16:17:41.449215 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 16:17:41.449235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 16:17:41.449237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 16:17:41.452519 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://823fcbd515c4062278266fe96362cb29a3feb8ce36177454bdd07db363b3ada3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e0a5ab323e7c4e169ec9e5c706d1cac1a74c9faca81263dd763f0d9faf90931\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.833408 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.845888 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57ea4ef9107287afd9fba6a6271d3f83e2ea4423ba751d65f674c41cee0a17ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.860757 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.865538 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.865589 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.865600 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.865617 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.865630 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.874444 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ppcpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:31Z\\\",\\\"message\\\":\\\"2025-10-02T16:17:46+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb\\\\n2025-10-02T16:17:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c6626744-09d3-4e1e-903d-eeee5f1b4bcb to /host/opt/cni/bin/\\\\n2025-10-02T16:17:46Z [verbose] multus-daemon started\\\\n2025-10-02T16:17:46Z [verbose] Readiness Indicator file check\\\\n2025-10-02T16:18:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ld9lr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ppcpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.885673 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f988cab-7579-4a12-8df6-e3e91e42f7df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tvplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:58Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6ldvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.899358 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c74771fcf09bcefa43f8591f481ff0c17e7f9b36bb3d1071f496a070465529ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.912739 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.924900 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74fe239eb7dcb5c8a3bc1bcc72eed0e4b64b1c5ae58d5e640d5666d49eccdff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e5dd2d766acda8752def42ec3f8b88b3abcc400fc26e2ff494ee76da0ce1c1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.935858 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5mdgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f246815-70b9-4dc7-972c-76ba716075ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a883cb6f410d8ab5850559c084c5be10a51002e1f94481f598ff99a76cf2623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7mfz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5mdgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.951575 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75c66302-c57d-41c8-a014-97f26deffd27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b2fa77e4aa35fc0e2b7572d9a0dac6ff5c547363003a476c8b83c88c3a79814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1abcb50de09cc5a4eebba8c3a586cea56b5c15832c4c019c7d8e5dda75a1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29876c8668ba26c19c4ae650a3d4f1c0db1e759db76ce9dd546b5c5069e561ba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32b8903c6e0877400becff951aeb45056b3ef3fbab24a05512b0ae7c381d4aab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeaacd1f7a2150e878cb17c48e5e945e63e81b27e88f21a10933a9c78f27c576\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a2667e6cebfff40a338b814b68eb57f389a83701142ec960dd96e19cc3dbefe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71695962d0bfebdbacb81eb1de9da5807d2c0c5c79de51bd55bef38503549525\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5bq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5wxmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.968477 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.968517 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:02 crc 
kubenswrapper[4882]: I1002 16:19:02.968563 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.968582 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.968597 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:02Z","lastTransitionTime":"2025-10-02T16:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.976063 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7911af1a-fc82-463b-b72d-9c55e5073e45\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951
dcb0a6cfc63a27790126d75d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T16:18:36Z\\\",\\\"message\\\":\\\"vices.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 16:18:36.754065 6904 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nF1002 16:18:36.754073 6904 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webho\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T16:18:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7zv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p6qjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:02 crc kubenswrapper[4882]: I1002 16:19:02.990664 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0109f456-9243-4bdc-b121-682da0443b4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb744de6fa45392511c561edc46500cf56422eaf45e6995ab779b7fc173f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28c66aac15df9050bbb46ccf812e3d92c42b824fc4aaf64513513c488b128a29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2591ac07065c7940d5f496a378f0b0c259639427148f6b47ce5952c0e1bfbc90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbd6c446f408795c03888bb4a3aebbf5c0e00f6f82b0fa8ddacfbc6d005d87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:02Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.003780 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14db90a4-dee7-4295-ba1b-ade2efacc365\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://383645694a8eae5a154575528752fdad0ad208a54b4312daa2d6b937bb5857d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1691bf0f4975d178560bfad1a06dc212e4b62f7d51fe729b94fbae39ed0b0997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be5b4ea543780afef271f06c9accd8109e50912c0bd2f05649cacd123fbb211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7495152db05b9fea10d3de6a6e4f1153d08a39f8fdae49bd0ac7ab5c10af0134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T16:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T16:17:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.014387 4882 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e41edd9-d556-45f0-b911-a7d65ecc7ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T16:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bd5b6b86575b2a0544d74b73a717b674a2342898dc89b8ec3ac2fbab19bb258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1767ca7cf61eb83951f15deebf9e8c300084af821b0cd83f023d2a6568c7a194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T16:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz6bn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T16:17:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bxbkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T16:19:03Z is after 2025-08-24T17:21:41Z" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.071855 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.071907 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.071920 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.071938 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.071950 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.174939 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.175006 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.175025 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.175052 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.175071 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.282918 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.282984 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.282997 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.283018 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.283032 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.387945 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.387988 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.387997 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.388012 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.388024 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.491256 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.491313 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.491333 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.491363 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.491380 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.594428 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.594471 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.594483 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.594498 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.594509 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.697863 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.697918 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.697931 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.697951 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.697962 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.759617 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.759673 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.759714 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.759950 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:03 crc kubenswrapper[4882]: E1002 16:19:03.760041 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:03 crc kubenswrapper[4882]: E1002 16:19:03.760473 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:03 crc kubenswrapper[4882]: E1002 16:19:03.760613 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:03 crc kubenswrapper[4882]: E1002 16:19:03.760657 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.782684 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.801332 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.801393 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.801406 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.801422 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.801433 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.904390 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.904445 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.904454 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.904472 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:03 crc kubenswrapper[4882]: I1002 16:19:03.904482 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:03Z","lastTransitionTime":"2025-10-02T16:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.008674 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.008755 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.008776 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.008803 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.008822 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.112232 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.112289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.112305 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.112328 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.112343 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.215926 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.215974 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.215986 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.216005 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.216021 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.318910 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.318971 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.318988 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.319014 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.319031 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.421545 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.421591 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.421599 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.421614 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.421626 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.525136 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.525181 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.525190 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.525205 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.525238 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.628655 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.628710 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.628721 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.628736 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.628756 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.734629 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.734693 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.734702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.734718 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.734729 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.837308 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.837353 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.837361 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.837374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.837384 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.940919 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.940961 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.940971 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.940987 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:04 crc kubenswrapper[4882]: I1002 16:19:04.941001 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:04Z","lastTransitionTime":"2025-10-02T16:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.043374 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.043456 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.043485 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.043519 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.043542 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.146200 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.146297 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.146310 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.146331 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.146346 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.250276 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.250339 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.250354 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.250378 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.250394 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.354158 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.354269 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.354290 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.354312 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.354327 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.457973 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.458383 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.458501 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.458631 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.458744 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.562256 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.562307 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.562316 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.562336 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.562347 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.665157 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.665256 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.665276 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.665300 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.665318 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.759888 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.760002 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.759901 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.759901 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:05 crc kubenswrapper[4882]: E1002 16:19:05.760248 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:05 crc kubenswrapper[4882]: E1002 16:19:05.760429 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:05 crc kubenswrapper[4882]: E1002 16:19:05.760518 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:05 crc kubenswrapper[4882]: E1002 16:19:05.760668 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.768502 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.768649 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.768675 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.768702 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 16:19:05 crc kubenswrapper[4882]: I1002 16:19:05.768720 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:05Z","lastTransitionTime":"2025-10-02T16:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 16:19:07 crc kubenswrapper[4882]: I1002 16:19:07.759285 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:19:07 crc kubenswrapper[4882]: I1002 16:19:07.759338 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 16:19:07 crc kubenswrapper[4882]: I1002 16:19:07.759285 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:19:07 crc kubenswrapper[4882]: E1002 16:19:07.759481 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 16:19:07 crc kubenswrapper[4882]: I1002 16:19:07.759301 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 16:19:07 crc kubenswrapper[4882]: E1002 16:19:07.759801 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df"
Oct 02 16:19:07 crc kubenswrapper[4882]: E1002 16:19:07.759904 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 16:19:07 crc kubenswrapper[4882]: E1002 16:19:07.759531 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.118052 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.118103 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.118113 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.118128 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.118139 4882 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T16:19:10Z","lastTransitionTime":"2025-10-02T16:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.144074 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"]
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.144512 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.147261 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.147528 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.148129 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.148863 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.187199 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=60.187163793 podStartE2EDuration="1m0.187163793s" podCreationTimestamp="2025-10-02 16:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.185848631 +0000 UTC m=+108.935078198" watchObservedRunningTime="2025-10-02 16:19:10.187163793 +0000 UTC m=+108.936393360"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.187572 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.187550963 podStartE2EDuration="1m29.187550963s" podCreationTimestamp="2025-10-02 16:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.169208626 +0000 UTC m=+108.918438183" watchObservedRunningTime="2025-10-02 16:19:10.187550963 +0000 UTC m=+108.936780500"
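The pod_startup_latency_tracker entries can be checked against their own fields: for openshift-kube-scheduler-crc, watchObservedRunningTime (16:19:10.187163793) minus podCreationTimestamp (16:18:10) is 60.187163793s, exactly the logged podStartSLOduration (no image pull ever ran here, so the E2E duration matches too). A quick Go check of that arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the openshift-kube-scheduler-crc entry above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-10-02 16:18:10 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-10-02 16:19:10.187163793 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 60.187163793, matching the logged podStartSLOduration.
	fmt.Printf("%.9f\n", observed.Sub(created).Seconds())
}
```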
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.209809 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bxbkp" podStartSLOduration=86.209777066 podStartE2EDuration="1m26.209777066s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.20953524 +0000 UTC m=+108.958764777" watchObservedRunningTime="2025-10-02 16:19:10.209777066 +0000 UTC m=+108.959006613"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.247762 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podStartSLOduration=87.247737403 podStartE2EDuration="1m27.247737403s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.231800744 +0000 UTC m=+108.981030291" watchObservedRunningTime="2025-10-02 16:19:10.247737403 +0000 UTC m=+108.996966930"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.260059 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.260125 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2b5c396-25eb-4af3-a4d6-63d000f21678-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.260150 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b5c396-25eb-4af3-a4d6-63d000f21678-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.260172 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b5c396-25eb-4af3-a4d6-63d000f21678-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.260265 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn"
Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.269343 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pm5z5" podStartSLOduration=87.26931176 podStartE2EDuration="1m27.26931176s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.247987049 +0000 UTC m=+108.997216576" watchObservedRunningTime="2025-10-02 16:19:10.26931176 +0000 UTC m=+109.018541287"
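The volume handling for cluster-version-operator-5c965bbfc6-tczdn runs through three reconciler phases, all visible in this log: VerifyControllerAttachedVolume above, then MountVolume started and MountVolume.SetUp succeeded for the same five volumes below. A small, illustrative Go filter that pairs those phases per volume when fed a journal excerpt on stdin (the regexp and grouping are this sketch's own, not kubelet tooling):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Match the three phase names and capture the volume name; the log escapes
	// the inner quotes as \" so the backslash is optional in the pattern.
	re := regexp.MustCompile(`(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded) for volume \\?"([^"\\]+)\\?"`)
	phases := map[string][]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			phases[m[2]] = append(phases[m[2]], m[1])
		}
	}
	for vol, seq := range phases {
		fmt.Println(vol, "->", seq)
	}
}
```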
"Observed pod startup duration" pod="openshift-image-registry/node-ca-pm5z5" podStartSLOduration=87.26931176 podStartE2EDuration="1m27.26931176s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.247987049 +0000 UTC m=+108.997216576" watchObservedRunningTime="2025-10-02 16:19:10.26931176 +0000 UTC m=+109.018541287" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.282774 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.282737958 podStartE2EDuration="29.282737958s" podCreationTimestamp="2025-10-02 16:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.282267876 +0000 UTC m=+109.031497413" watchObservedRunningTime="2025-10-02 16:19:10.282737958 +0000 UTC m=+109.031967475" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.298400 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.29837672 podStartE2EDuration="1m28.29837672s" podCreationTimestamp="2025-10-02 16:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.297898039 +0000 UTC m=+109.047127566" watchObservedRunningTime="2025-10-02 16:19:10.29837672 +0000 UTC m=+109.047606257" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.361363 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b5c396-25eb-4af3-a4d6-63d000f21678-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.361423 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b5c396-25eb-4af3-a4d6-63d000f21678-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.361469 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.361517 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.361538 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a2b5c396-25eb-4af3-a4d6-63d000f21678-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.362544 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.362641 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a2b5c396-25eb-4af3-a4d6-63d000f21678-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.362878 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b5c396-25eb-4af3-a4d6-63d000f21678-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.368641 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b5c396-25eb-4af3-a4d6-63d000f21678-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.380909 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2b5c396-25eb-4af3-a4d6-63d000f21678-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tczdn\" (UID: \"a2b5c396-25eb-4af3-a4d6-63d000f21678\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.388315 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ppcpg" podStartSLOduration=87.388290836 podStartE2EDuration="1m27.388290836s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.362819684 +0000 UTC m=+109.112049211" watchObservedRunningTime="2025-10-02 16:19:10.388290836 +0000 UTC m=+109.137520363" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.428345 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=7.4283211940000005 podStartE2EDuration="7.428321194s" podCreationTimestamp="2025-10-02 16:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.428309063 +0000 UTC m=+109.177538600" watchObservedRunningTime="2025-10-02 16:19:10.428321194 +0000 UTC m=+109.177550721" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 
16:19:10.470843 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" Oct 02 16:19:10 crc kubenswrapper[4882]: I1002 16:19:10.524660 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5mdgg" podStartSLOduration=87.524632555 podStartE2EDuration="1m27.524632555s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.506662977 +0000 UTC m=+109.255892514" watchObservedRunningTime="2025-10-02 16:19:10.524632555 +0000 UTC m=+109.273862092" Oct 02 16:19:10 crc kubenswrapper[4882]: W1002 16:19:10.529163 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b5c396_25eb_4af3_a4d6_63d000f21678.slice/crio-4ab0e9e844d7159bbefdcabc1c5f796c455f53597c764172260502b56cc0b411 WatchSource:0}: Error finding container 4ab0e9e844d7159bbefdcabc1c5f796c455f53597c764172260502b56cc0b411: Status 404 returned error can't find the container with id 4ab0e9e844d7159bbefdcabc1c5f796c455f53597c764172260502b56cc0b411 Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.436975 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" event={"ID":"a2b5c396-25eb-4af3-a4d6-63d000f21678","Type":"ContainerStarted","Data":"db8c989775a9b9bc4a018d4cffcac349dc060eb9c9e2762a286fa4163f7d620d"} Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.437035 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" event={"ID":"a2b5c396-25eb-4af3-a4d6-63d000f21678","Type":"ContainerStarted","Data":"4ab0e9e844d7159bbefdcabc1c5f796c455f53597c764172260502b56cc0b411"} Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.454005 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5wxmw" podStartSLOduration=88.453984212 podStartE2EDuration="1m28.453984212s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:10.526114871 +0000 UTC m=+109.275344398" watchObservedRunningTime="2025-10-02 16:19:11.453984212 +0000 UTC m=+110.203213739" Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.759940 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.760081 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:11 crc kubenswrapper[4882]: E1002 16:19:11.760305 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.760376 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:11 crc kubenswrapper[4882]: E1002 16:19:11.760117 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:11 crc kubenswrapper[4882]: E1002 16:19:11.760440 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:11 crc kubenswrapper[4882]: I1002 16:19:11.760457 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:11 crc kubenswrapper[4882]: E1002 16:19:11.760525 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:13 crc kubenswrapper[4882]: I1002 16:19:13.760052 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:13 crc kubenswrapper[4882]: I1002 16:19:13.760400 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:13 crc kubenswrapper[4882]: I1002 16:19:13.760440 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:13 crc kubenswrapper[4882]: E1002 16:19:13.760536 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:13 crc kubenswrapper[4882]: I1002 16:19:13.760813 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:13 crc kubenswrapper[4882]: E1002 16:19:13.760889 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:13 crc kubenswrapper[4882]: E1002 16:19:13.761029 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:13 crc kubenswrapper[4882]: E1002 16:19:13.761154 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:14 crc kubenswrapper[4882]: I1002 16:19:14.760305 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:19:14 crc kubenswrapper[4882]: E1002 16:19:14.760601 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p6qjz_openshift-ovn-kubernetes(7911af1a-fc82-463b-b72d-9c55e5073e45)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" Oct 02 16:19:15 crc kubenswrapper[4882]: I1002 16:19:15.759934 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:15 crc kubenswrapper[4882]: I1002 16:19:15.759961 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:15 crc kubenswrapper[4882]: I1002 16:19:15.760020 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:15 crc kubenswrapper[4882]: I1002 16:19:15.760100 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:15 crc kubenswrapper[4882]: E1002 16:19:15.760225 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:15 crc kubenswrapper[4882]: E1002 16:19:15.760355 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:15 crc kubenswrapper[4882]: E1002 16:19:15.760545 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:15 crc kubenswrapper[4882]: E1002 16:19:15.760691 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:17 crc kubenswrapper[4882]: I1002 16:19:17.760196 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:17 crc kubenswrapper[4882]: I1002 16:19:17.760256 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:17 crc kubenswrapper[4882]: I1002 16:19:17.760196 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:17 crc kubenswrapper[4882]: E1002 16:19:17.760377 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:17 crc kubenswrapper[4882]: E1002 16:19:17.760507 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:17 crc kubenswrapper[4882]: I1002 16:19:17.760527 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:17 crc kubenswrapper[4882]: E1002 16:19:17.760901 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:17 crc kubenswrapper[4882]: E1002 16:19:17.760995 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.463314 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/1.log" Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.464049 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/0.log" Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.464104 4882 generic.go:334] "Generic (PLEG): container finished" podID="565a5a5f-e220-4ce6-86a7-f94f9dbe48c2" containerID="ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e" exitCode=1 Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.464139 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerDied","Data":"ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e"} Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.464179 4882 scope.go:117] "RemoveContainer" containerID="e9603a19b13f1180671fc78911cc42cf9661becfd6ee53ace0c77feb4be1680f" Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.464776 4882 scope.go:117] "RemoveContainer" containerID="ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e" Oct 02 16:19:18 crc kubenswrapper[4882]: E1002 16:19:18.464995 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ppcpg_openshift-multus(565a5a5f-e220-4ce6-86a7-f94f9dbe48c2)\"" pod="openshift-multus/multus-ppcpg" podUID="565a5a5f-e220-4ce6-86a7-f94f9dbe48c2" Oct 02 16:19:18 crc kubenswrapper[4882]: I1002 16:19:18.483109 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tczdn" podStartSLOduration=95.483083939 podStartE2EDuration="1m35.483083939s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:11.454482015 +0000 UTC m=+110.203711542" watchObservedRunningTime="2025-10-02 16:19:18.483083939 +0000 UTC m=+117.232313476" Oct 02 16:19:19 crc kubenswrapper[4882]: I1002 16:19:19.470305 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/1.log" Oct 02 16:19:19 crc kubenswrapper[4882]: I1002 16:19:19.759990 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:19 crc kubenswrapper[4882]: I1002 16:19:19.760031 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:19 crc kubenswrapper[4882]: I1002 16:19:19.760089 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:19 crc kubenswrapper[4882]: E1002 16:19:19.760127 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:19 crc kubenswrapper[4882]: I1002 16:19:19.760004 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:19 crc kubenswrapper[4882]: E1002 16:19:19.760186 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:19 crc kubenswrapper[4882]: E1002 16:19:19.760358 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:19 crc kubenswrapper[4882]: E1002 16:19:19.760468 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:21 crc kubenswrapper[4882]: I1002 16:19:21.759552 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:21 crc kubenswrapper[4882]: I1002 16:19:21.759684 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:21 crc kubenswrapper[4882]: E1002 16:19:21.759743 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:21 crc kubenswrapper[4882]: I1002 16:19:21.759586 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:21 crc kubenswrapper[4882]: I1002 16:19:21.759610 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:21 crc kubenswrapper[4882]: E1002 16:19:21.759844 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:21 crc kubenswrapper[4882]: E1002 16:19:21.759945 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:21 crc kubenswrapper[4882]: E1002 16:19:21.759994 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:22 crc kubenswrapper[4882]: E1002 16:19:22.708922 4882 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 16:19:22 crc kubenswrapper[4882]: E1002 16:19:22.875793 4882 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 16:19:23 crc kubenswrapper[4882]: I1002 16:19:23.759680 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:23 crc kubenswrapper[4882]: I1002 16:19:23.759690 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:23 crc kubenswrapper[4882]: I1002 16:19:23.759680 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:23 crc kubenswrapper[4882]: E1002 16:19:23.759993 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:23 crc kubenswrapper[4882]: E1002 16:19:23.759831 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:23 crc kubenswrapper[4882]: E1002 16:19:23.760006 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:23 crc kubenswrapper[4882]: I1002 16:19:23.759831 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:23 crc kubenswrapper[4882]: E1002 16:19:23.760109 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:25 crc kubenswrapper[4882]: I1002 16:19:25.759884 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:25 crc kubenswrapper[4882]: I1002 16:19:25.759959 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:25 crc kubenswrapper[4882]: I1002 16:19:25.760003 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:25 crc kubenswrapper[4882]: E1002 16:19:25.760079 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:25 crc kubenswrapper[4882]: I1002 16:19:25.760287 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:25 crc kubenswrapper[4882]: E1002 16:19:25.760327 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:25 crc kubenswrapper[4882]: E1002 16:19:25.760462 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:25 crc kubenswrapper[4882]: E1002 16:19:25.760522 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:25 crc kubenswrapper[4882]: I1002 16:19:25.761525 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.500114 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/3.log" Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.503513 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerStarted","Data":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.503951 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.674512 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podStartSLOduration=102.674485732 podStartE2EDuration="1m42.674485732s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:26.530133167 +0000 UTC m=+125.279362694" watchObservedRunningTime="2025-10-02 16:19:26.674485732 +0000 UTC m=+125.423715259" Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.674961 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6ldvk"] Oct 02 16:19:26 crc kubenswrapper[4882]: I1002 16:19:26.675073 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:26 crc kubenswrapper[4882]: E1002 16:19:26.675182 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:27 crc kubenswrapper[4882]: I1002 16:19:27.759661 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:27 crc kubenswrapper[4882]: I1002 16:19:27.759725 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:27 crc kubenswrapper[4882]: I1002 16:19:27.759799 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:27 crc kubenswrapper[4882]: E1002 16:19:27.760126 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:27 crc kubenswrapper[4882]: E1002 16:19:27.760406 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:27 crc kubenswrapper[4882]: E1002 16:19:27.760503 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:27 crc kubenswrapper[4882]: E1002 16:19:27.877001 4882 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 16:19:28 crc kubenswrapper[4882]: I1002 16:19:28.760040 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:28 crc kubenswrapper[4882]: E1002 16:19:28.760244 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:29 crc kubenswrapper[4882]: I1002 16:19:29.759850 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:29 crc kubenswrapper[4882]: I1002 16:19:29.759865 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:29 crc kubenswrapper[4882]: E1002 16:19:29.760062 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:29 crc kubenswrapper[4882]: E1002 16:19:29.760400 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:29 crc kubenswrapper[4882]: I1002 16:19:29.760777 4882 scope.go:117] "RemoveContainer" containerID="ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e" Oct 02 16:19:29 crc kubenswrapper[4882]: I1002 16:19:29.760364 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:29 crc kubenswrapper[4882]: E1002 16:19:29.761294 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:30 crc kubenswrapper[4882]: I1002 16:19:30.521505 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/1.log" Oct 02 16:19:30 crc kubenswrapper[4882]: I1002 16:19:30.521810 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerStarted","Data":"7aa35c2662c0805faaf732c1db3dfb6ccb2d6c73ddd56a45203e524734283fef"} Oct 02 16:19:30 crc kubenswrapper[4882]: I1002 16:19:30.760175 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:30 crc kubenswrapper[4882]: E1002 16:19:30.760451 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:31 crc kubenswrapper[4882]: I1002 16:19:31.759813 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:31 crc kubenswrapper[4882]: I1002 16:19:31.759877 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:31 crc kubenswrapper[4882]: E1002 16:19:31.759996 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 16:19:31 crc kubenswrapper[4882]: I1002 16:19:31.760021 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:31 crc kubenswrapper[4882]: E1002 16:19:31.760163 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 16:19:31 crc kubenswrapper[4882]: E1002 16:19:31.760357 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 16:19:32 crc kubenswrapper[4882]: I1002 16:19:32.759380 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:32 crc kubenswrapper[4882]: E1002 16:19:32.760944 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ldvk" podUID="9f988cab-7579-4a12-8df6-e3e91e42f7df" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.759293 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.759371 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.759470 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.763736 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.764487 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.764640 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 16:19:33 crc kubenswrapper[4882]: I1002 16:19:33.765831 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 16:19:34 crc kubenswrapper[4882]: I1002 16:19:34.759887 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk" Oct 02 16:19:34 crc kubenswrapper[4882]: I1002 16:19:34.762998 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 16:19:34 crc kubenswrapper[4882]: I1002 16:19:34.767110 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 16:19:37 crc kubenswrapper[4882]: I1002 16:19:37.489658 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.880289 4882 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.921009 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.923002 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.930352 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.932253 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.935296 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.944625 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-snzmw"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.948262 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.949033 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.952050 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.952330 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.952589 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.952758 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.952851 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.955582 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.956787 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9snj8"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.957017 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.957444 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbzd"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.957919 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.958084 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.958869 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.959082 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.959581 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.960415 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.960459 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.960444 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.960436 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.961341 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.962034 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.962248 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.964650 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.964869 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965378 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965433 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965548 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965730 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965740 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965889 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965913 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.965959 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966009 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966091 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966113 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966134 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966253 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966296 4882 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966365 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.966371 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.967643 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.967852 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.967990 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968250 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968414 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968554 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968680 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968804 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.968939 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.969052 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.969088 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.969180 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.969764 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.970492 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.970698 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.970827 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.970924 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 
16:19:40.971006 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.971108 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.971176 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.971298 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.972694 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.973796 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.974690 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975295 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975300 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975755 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975961 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975982 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.975961 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.976720 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.983200 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.983599 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.984192 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.984201 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mw2kv"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.984690 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.984856 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.984865 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6n7gp"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.985616 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6n7gp" Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.986803 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.987585 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.996933 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:19:40 crc kubenswrapper[4882]: I1002 16:19:40.997590 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008001 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-image-import-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008053 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008096 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzm9l\" (UniqueName: \"kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008194 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-encryption-config\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008239 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008264 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-serving-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008290 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008316 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008340 4882 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-config\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008359 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008387 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008414 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-serving-cert\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008440 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjp9g\" (UniqueName: \"kubernetes.io/projected/ff38de28-245a-4acd-b148-e2b71a457eff-kube-api-access-jjp9g\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008470 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-audit\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008491 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-audit-dir\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008515 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-client\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008540 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " 
pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008563 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008659 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008680 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgwv6\" (UniqueName: \"kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008707 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008746 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-node-pullsecrets\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.008770 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.009622 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.009959 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.010671 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.010877 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.018568 4882 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"console-operator-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019149 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019209 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019235 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.018568 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019565 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019609 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019686 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019729 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019890 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.019611 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020198 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020257 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020448 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020532 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020573 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020652 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020710 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020765 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020829 4882 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.020874 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.021003 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.023544 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.024329 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.025432 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.027487 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.029261 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.029441 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.063621 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.063877 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.063979 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.064408 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.065135 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.065469 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.065494 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.065605 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.072867 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.077852 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.083129 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.083174 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.083403 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbzd"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.088456 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.090842 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.091801 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6wpc8"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.092691 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.093079 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.094032 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.094595 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.095294 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.096856 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.099086 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.099374 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.100914 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.101245 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.101309 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.101577 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.101780 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.103063 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.104688 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.107268 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.107315 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6n7gp"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.108148 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-snzmw"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.108328 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.114789 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-serving-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.114860 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-config\") pod 
\"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.114900 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.114939 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.114977 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115016 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115054 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-serving-cert\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115088 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-audit\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115132 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-audit-dir\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115179 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp9g\" (UniqueName: \"kubernetes.io/projected/ff38de28-245a-4acd-b148-e2b71a457eff-kube-api-access-jjp9g\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115233 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-client\") 
pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115271 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115302 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115574 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgwv6\" (UniqueName: \"kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115605 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115660 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115716 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-node-pullsecrets\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115755 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115790 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-image-import-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115821 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115855 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzm9l\" (UniqueName: \"kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115891 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-encryption-config\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.115929 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.126888 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.126971 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.126982 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-audit-dir\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.127571 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ff38de28-245a-4acd-b148-e2b71a457eff-node-pullsecrets\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.138513 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.142888 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.149550 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9snj8"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.160021 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.161812 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 
16:19:41.163077 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.170366 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.174409 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.175227 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vf2qc"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.175617 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.175892 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.177356 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.178056 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.178464 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.179568 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.181032 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.185634 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.185856 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.187044 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2v5c"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.187690 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9t27z"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.188237 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lgptr"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.188422 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.188604 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.189189 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.189252 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.196923 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.197816 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.202510 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.203555 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.203663 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.205137 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.205998 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.207626 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.208603 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.208903 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.209548 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqvp9"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.209625 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214460 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214614 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214660 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214701 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214585 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.214961 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.217636 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hmk79"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.218543 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.221318 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2v5c"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.222888 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.224247 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wpc8"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.225228 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.225917 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.227196 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.228826 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.230773 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.232035 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.233200 4882 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mw2kv"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.234158 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.235280 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.236361 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.237465 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.238902 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lgptr"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.240950 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.242279 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.243554 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.245266 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.246517 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9t27z"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.246873 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.248118 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.310456 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.311329 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.311451 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.311887 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:41.81186651 +0000 UTC m=+140.561096047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.316884 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.321678 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqvp9"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.324980 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x5zj4"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.325554 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.326731 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.327154 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.329582 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zjj8d"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.330240 4882 util.go:30] "No sandbox for pod can be found. 
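The one error-level entry above deserves a note: MountVolume.MountDevice for the image registry's PVC fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", and nestedpendingoperations.go:348 schedules the retry ("No retries permitted until ... durationBeforeRetry 500ms"). The cause is ordering, not breakage: the hostpath-provisioner/csi-hostpathplugin-x5zj4 pod that registers the driver is itself only ADDed a few entries later, so the mount is retried with backoff until the plugin registers. The sketch below shows the retry-with-backoff shape implied by the log; the function names and the registration schedule are invented for illustration and are not kubelet's nestedpendingoperations API.

```go
// A minimal sketch of the backoff behavior implied by "No retries permitted
// until ... (durationBeforeRetry 500ms)". All names are hypothetical.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errDriverNotRegistered = errors.New("driver not found in the list of registered CSI drivers")

// mountDevice stands in for the CSI MountDevice call; it can only succeed
// once the (hypothetical) driver plugin has registered with the node.
func mountDevice(registered bool) error {
	if !registered {
		return errDriverNotRegistered
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond // initial durationBeforeRetry, as in the log
	for attempt := 1; ; attempt++ {
		registered := attempt >= 3 // pretend the plugin registers before attempt 3
		if err := mountDevice(registered); err != nil {
			fmt.Printf("attempt %d failed: %v; no retries permitted for %v\n", attempt, err, backoff)
			time.Sleep(backoff)
			backoff *= 2 // grow the wait between attempts (capped in real systems)
			continue
		}
		fmt.Printf("attempt %d: MountDevice succeeded\n", attempt)
		return
	}
}
```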
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.335597 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjj8d"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.339947 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x5zj4"] Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.343877 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.344205 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-config\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.344829 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-encryption-config\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.344935 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.345041 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-audit\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.345201 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-client\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.345260 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-etcd-serving-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.345714 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff38de28-245a-4acd-b148-e2b71a457eff-serving-cert\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 
16:19:41.345821 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.345745 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.346127 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.346444 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.346601 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.347030 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.347398 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.348206 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ff38de28-245a-4acd-b148-e2b71a457eff-image-import-ca\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.366631 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.383509 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.386270 4882 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.386489 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.406405 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412158 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.412301 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:41.91227805 +0000 UTC m=+140.661507577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412508 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb4771a-d742-436a-964c-c19d1ac905d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412563 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pq49\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412603 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed430b79-ad1a-456d-be2f-6cb51f2564dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412640 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-config\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412676 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-audit-policies\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412708 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec64748-14d0-4078-a51e-deee8610c82f-audit-dir\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412739 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7886091e-37f9-44f7-b04d-a210a82a62a8-machine-approver-tls\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412773 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2sx9\" (UniqueName: \"kubernetes.io/projected/7886091e-37f9-44f7-b04d-a210a82a62a8-kube-api-access-j2sx9\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412798 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-trusted-ca\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412834 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb4771a-d742-436a-964c-c19d1ac905d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412881 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.412994 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.413023 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.413044 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.413063 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.413087 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-auth-proxy-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.413105 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-serving-cert\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.414954 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415009 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvslk\" (UniqueName: \"kubernetes.io/projected/41edc773-9f85-408f-9605-a86000b41aa2-kube-api-access-wvslk\") pod \"downloads-7954f5f757-6n7gp\" (UID: \"41edc773-9f85-408f-9605-a86000b41aa2\") " pod="openshift-console/downloads-7954f5f757-6n7gp" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415034 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415107 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415166 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415204 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415276 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-client\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415333 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzdz\" (UniqueName: \"kubernetes.io/projected/ed430b79-ad1a-456d-be2f-6cb51f2564dc-kube-api-access-vkzdz\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415396 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415428 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-serving-cert\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415457 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415488 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415518 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2d5\" (UniqueName: \"kubernetes.io/projected/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-kube-api-access-hk2d5\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415543 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0133093-ff9a-4741-abf2-746361b98451-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415735 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkk8\" (UniqueName: \"kubernetes.io/projected/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-kube-api-access-kdkk8\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415767 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415796 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-config\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415816 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415836 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb409fa4-c522-4729-9c26-11d24aab19e3-serving-cert\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415858 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2f6\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-kube-api-access-pc2f6\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415880 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxgsw\" (UniqueName: \"kubernetes.io/projected/e34155fb-8fe4-480a-aa87-518c3e344cab-kube-api-access-pxgsw\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.415983 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhp6t\" (UniqueName: \"kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416077 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416108 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-serving-cert\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416133 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58g45\" (UniqueName: \"kubernetes.io/projected/3eb4771a-d742-436a-964c-c19d1ac905d6-kube-api-access-58g45\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416166 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416193 
4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-config\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416239 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e34155fb-8fe4-480a-aa87-518c3e344cab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416276 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416443 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416536 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-encryption-config\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416588 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2fkm\" (UniqueName: \"kubernetes.io/projected/0ec64748-14d0-4078-a51e-deee8610c82f-kube-api-access-c2fkm\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416611 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74p89\" (UniqueName: \"kubernetes.io/projected/a182d0c0-45a9-450b-affc-44caf339abd8-kube-api-access-74p89\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416647 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc 
kubenswrapper[4882]: I1002 16:19:41.416676 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416696 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416716 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416743 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416765 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-images\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416794 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416843 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.416863 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0133093-ff9a-4741-abf2-746361b98451-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.417083 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.417141 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cmx\" (UniqueName: \"kubernetes.io/projected/cb409fa4-c522-4729-9c26-11d24aab19e3-kube-api-access-f5cmx\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.417193 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.417269 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.417375 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.417729 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:41.917712169 +0000 UTC m=+140.666941696 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.426014 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.466274 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.486938 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.506399 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518350 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.518542 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.018502049 +0000 UTC m=+140.767731576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518642 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-config\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518721 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518775 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkxm\" (UniqueName: \"kubernetes.io/projected/09a298da-0301-49c1-b755-1b5ec0058b3e-kube-api-access-hwkxm\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518809 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed430b79-ad1a-456d-be2f-6cb51f2564dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518837 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-audit-policies\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518887 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec64748-14d0-4078-a51e-deee8610c82f-audit-dir\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518914 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7886091e-37f9-44f7-b04d-a210a82a62a8-machine-approver-tls\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518938 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-trusted-ca\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.518992 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-default-certificate\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519024 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b0dd64-7e97-4cac-bcd1-1cc312027a48-service-ca-bundle\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519050 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsx2d\" (UniqueName: \"kubernetes.io/projected/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-kube-api-access-nsx2d\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519080 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-service-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519104 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519129 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-auth-proxy-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519159 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-serving-cert\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519184 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: 
\"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519237 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519267 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-serving-cert\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519303 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-client\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519330 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13689e84-053e-4f97-8689-6a4c14800153-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519354 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-client\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519379 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519406 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2d5\" (UniqueName: \"kubernetes.io/projected/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-kube-api-access-hk2d5\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519462 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0133093-ff9a-4741-abf2-746361b98451-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.519560 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.019491085 +0000 UTC m=+140.768720652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519622 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec64748-14d0-4078-a51e-deee8610c82f-audit-dir\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519625 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.519700 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-webhook-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520327 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb409fa4-c522-4729-9c26-11d24aab19e3-serving-cert\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520398 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba2a584-8773-444f-a47f-d29bec61b7f1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520426 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxgsw\" (UniqueName: \"kubernetes.io/projected/e34155fb-8fe4-480a-aa87-518c3e344cab-kube-api-access-pxgsw\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520532 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhp6t\" (UniqueName: \"kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520568 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmb6\" (UniqueName: \"kubernetes.io/projected/4106b164-7d6a-4837-840b-f5a068e1aec9-kube-api-access-sgmb6\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520695 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-audit-policies\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520716 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520757 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-encryption-config\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.520911 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521071 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-trusted-ca\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521282 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13689e84-053e-4f97-8689-6a4c14800153-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521306 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e8f1c724-37a2-4524-8e7c-21fb617b124a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521341 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521380 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-images\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521400 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521419 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4106b164-7d6a-4837-840b-f5a068e1aec9-metrics-tls\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521647 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521794 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.521960 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522029 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-auth-proxy-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522084 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522140 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522167 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522359 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522399 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0133093-ff9a-4741-abf2-746361b98451-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522534 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f1c724-37a2-4524-8e7c-21fb617b124a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522573 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13689e84-053e-4f97-8689-6a4c14800153-config\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522666 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7886091e-37f9-44f7-b04d-a210a82a62a8-config\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522713 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522854 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfctr\" (UniqueName: \"kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522902 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba2a584-8773-444f-a47f-d29bec61b7f1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522936 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.522975 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523005 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5cmx\" (UniqueName: \"kubernetes.io/projected/cb409fa4-c522-4729-9c26-11d24aab19e3-kube-api-access-f5cmx\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523039 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523066 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523271 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523764 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-tmpfs\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523806 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pq49\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523832 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb4771a-d742-436a-964c-c19d1ac905d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.523971 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-apiservice-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524043 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524156 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kln2\" (UniqueName: \"kubernetes.io/projected/dba2a584-8773-444f-a47f-d29bec61b7f1-kube-api-access-2kln2\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524188 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-config\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 
16:19:41.524237 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-stats-auth\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524253 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-metrics-certs\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524319 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2sx9\" (UniqueName: \"kubernetes.io/projected/7886091e-37f9-44f7-b04d-a210a82a62a8-kube-api-access-j2sx9\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524339 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb4771a-d742-436a-964c-c19d1ac905d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524358 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k87tk\" (UniqueName: \"kubernetes.io/projected/1ed1366e-852f-4e88-ae80-6fc761781c31-kube-api-access-k87tk\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524408 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524430 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524506 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkssm\" (UniqueName: \"kubernetes.io/projected/c465e72e-8499-4dce-ab77-064f7ecb1c81-kube-api-access-dkssm\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524524 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-srv-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524542 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-config-volume\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524565 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524585 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524605 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524623 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvslk\" (UniqueName: \"kubernetes.io/projected/41edc773-9f85-408f-9605-a86000b41aa2-kube-api-access-wvslk\") pod \"downloads-7954f5f757-6n7gp\" (UID: \"41edc773-9f85-408f-9605-a86000b41aa2\") " pod="openshift-console/downloads-7954f5f757-6n7gp" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524644 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7krv\" (UniqueName: \"kubernetes.io/projected/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-kube-api-access-r7krv\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524730 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524749 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524769 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-serving-cert\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524786 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-key\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524813 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524832 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2cfab9-1657-44b4-b8d9-2bced2997338-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524850 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2cfab9-1657-44b4-b8d9-2bced2997338-config\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524866 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gxwr\" (UniqueName: \"kubernetes.io/projected/e8f1c724-37a2-4524-8e7c-21fb617b124a-kube-api-access-5gxwr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524896 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzdz\" (UniqueName: \"kubernetes.io/projected/ed430b79-ad1a-456d-be2f-6cb51f2564dc-kube-api-access-vkzdz\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524914 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzmh\" (UniqueName: 
\"kubernetes.io/projected/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-kube-api-access-nzzmh\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.524961 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb4771a-d742-436a-964c-c19d1ac905d6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525000 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525024 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525033 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525042 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqts\" (UniqueName: \"kubernetes.io/projected/638f69ac-9898-41f2-a2ee-0206e245db93-kube-api-access-ctqts\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525066 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bgtw\" (UniqueName: \"kubernetes.io/projected/a0bfe57e-f325-4e0f-910b-0f402074eb76-kube-api-access-9bgtw\") pod \"migrator-59844c95c7-x8fdq\" (UID: \"a0bfe57e-f325-4e0f-910b-0f402074eb76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525079 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525087 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdkk8\" (UniqueName: \"kubernetes.io/projected/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-kube-api-access-kdkk8\") pod 
\"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525113 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-config\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525132 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525150 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2f6\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-kube-api-access-pc2f6\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525168 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-cabundle\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525202 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kpc\" (UniqueName: \"kubernetes.io/projected/96b0dd64-7e97-4cac-bcd1-1cc312027a48-kube-api-access-l8kpc\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525238 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2cfab9-1657-44b4-b8d9-2bced2997338-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525447 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-encryption-config\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525781 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.525807 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526004 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526066 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526287 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526334 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-serving-cert\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526357 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58g45\" (UniqueName: \"kubernetes.io/projected/3eb4771a-d742-436a-964c-c19d1ac905d6-kube-api-access-58g45\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526381 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-profile-collector-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526402 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-config\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526422 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e34155fb-8fe4-480a-aa87-518c3e344cab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526440 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526462 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-metrics-tls\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526483 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526585 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb409fa4-c522-4729-9c26-11d24aab19e3-config\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526619 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526938 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-config\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.526962 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-serving-cert\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.527262 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.527704 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.527748 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.527953 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.528199 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-serving-cert\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.528546 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529065 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529235 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2fkm\" (UniqueName: \"kubernetes.io/projected/0ec64748-14d0-4078-a51e-deee8610c82f-kube-api-access-c2fkm\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529269 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74p89\" (UniqueName: \"kubernetes.io/projected/a182d0c0-45a9-450b-affc-44caf339abd8-kube-api-access-74p89\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529334 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529365 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.529766 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530456 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530496 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6f5t\" (UniqueName: \"kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530639 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530726 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb409fa4-c522-4729-9c26-11d24aab19e3-serving-cert\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530854 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb4771a-d742-436a-964c-c19d1ac905d6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.530953 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.531054 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0133093-ff9a-4741-abf2-746361b98451-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.531127 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.531337 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.531651 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.531887 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.532597 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.533108 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.533336 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-serving-cert\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.533471 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7886091e-37f9-44f7-b04d-a210a82a62a8-machine-approver-tls\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.533999 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.535366 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e34155fb-8fe4-480a-aa87-518c3e344cab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.535792 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ec64748-14d0-4078-a51e-deee8610c82f-etcd-client\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.535850 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0133093-ff9a-4741-abf2-746361b98451-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.537802 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed430b79-ad1a-456d-be2f-6cb51f2564dc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.540072 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.541606 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-images\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: 
\"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.541858 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed430b79-ad1a-456d-be2f-6cb51f2564dc-config\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.546634 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.566276 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.585833 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.606336 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.625929 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.632634 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.632968 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmb6\" (UniqueName: \"kubernetes.io/projected/4106b164-7d6a-4837-840b-f5a068e1aec9-kube-api-access-sgmb6\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633011 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13689e84-053e-4f97-8689-6a4c14800153-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633037 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633064 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f1c724-37a2-4524-8e7c-21fb617b124a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633089 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633111 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633138 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4106b164-7d6a-4837-840b-f5a068e1aec9-metrics-tls\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633171 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633199 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13689e84-053e-4f97-8689-6a4c14800153-config\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633288 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f1c724-37a2-4524-8e7c-21fb617b124a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633326 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633346 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfctr\" (UniqueName: \"kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633365 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba2a584-8773-444f-a47f-d29bec61b7f1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633384 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633410 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-tmpfs\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633430 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-apiservice-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633456 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kln2\" (UniqueName: \"kubernetes.io/projected/dba2a584-8773-444f-a47f-d29bec61b7f1-kube-api-access-2kln2\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633477 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-stats-auth\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633493 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-metrics-certs\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633510 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k87tk\" (UniqueName: \"kubernetes.io/projected/1ed1366e-852f-4e88-ae80-6fc761781c31-kube-api-access-k87tk\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkssm\" (UniqueName: \"kubernetes.io/projected/c465e72e-8499-4dce-ab77-064f7ecb1c81-kube-api-access-dkssm\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633555 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-srv-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633578 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-config-volume\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633611 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7krv\" (UniqueName: \"kubernetes.io/projected/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-kube-api-access-r7krv\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633645 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-serving-cert\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633674 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-key\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c"
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.633762 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.133725533 +0000 UTC m=+140.882955250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633844 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2cfab9-1657-44b4-b8d9-2bced2997338-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633889 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2cfab9-1657-44b4-b8d9-2bced2997338-config\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633944 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxwr\" (UniqueName: \"kubernetes.io/projected/e8f1c724-37a2-4524-8e7c-21fb617b124a-kube-api-access-5gxwr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.633987 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqts\" (UniqueName: \"kubernetes.io/projected/638f69ac-9898-41f2-a2ee-0206e245db93-kube-api-access-ctqts\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634028 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bgtw\" (UniqueName: \"kubernetes.io/projected/a0bfe57e-f325-4e0f-910b-0f402074eb76-kube-api-access-9bgtw\") pod \"migrator-59844c95c7-x8fdq\" (UID: \"a0bfe57e-f325-4e0f-910b-0f402074eb76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634062 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzmh\" (UniqueName: \"kubernetes.io/projected/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-kube-api-access-nzzmh\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634139 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-cabundle\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634178 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kpc\" (UniqueName: \"kubernetes.io/projected/96b0dd64-7e97-4cac-bcd1-1cc312027a48-kube-api-access-l8kpc\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634242 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2cfab9-1657-44b4-b8d9-2bced2997338-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634297 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-profile-collector-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634347 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-metrics-tls\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634418 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6f5t\" (UniqueName: \"kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634430 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-tmpfs\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634449 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634533 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634570 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-config\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634616 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkxm\" (UniqueName: \"kubernetes.io/projected/09a298da-0301-49c1-b755-1b5ec0058b3e-kube-api-access-hwkxm\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634643 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-default-certificate\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634670 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-config-volume\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634675 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-service-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634725 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b0dd64-7e97-4cac-bcd1-1cc312027a48-service-ca-bundle\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634754 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsx2d\" (UniqueName: \"kubernetes.io/projected/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-kube-api-access-nsx2d\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634782 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634810 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-client\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634837 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13689e84-053e-4f97-8689-6a4c14800153-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.634876 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-webhook-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.634912 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.134896485 +0000 UTC m=+140.884126012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.635671 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba2a584-8773-444f-a47f-d29bec61b7f1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.636032 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13689e84-053e-4f97-8689-6a4c14800153-config\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.636313 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f1c724-37a2-4524-8e7c-21fb617b124a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.636375 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba2a584-8773-444f-a47f-d29bec61b7f1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.638012 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-metrics-tls\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.639426 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba2a584-8773-444f-a47f-d29bec61b7f1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.639684 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13689e84-053e-4f97-8689-6a4c14800153-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.640047 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f1c724-37a2-4524-8e7c-21fb617b124a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.654587 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.667364 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.686158 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.698667 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2cfab9-1657-44b4-b8d9-2bced2997338-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.726168 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.735396 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2cfab9-1657-44b4-b8d9-2bced2997338-config\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.736972 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.737590 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.237557376 +0000 UTC m=+140.986786903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.737958 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.738430 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.23840873 +0000 UTC m=+140.987638257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.742800 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp9g\" (UniqueName: \"kubernetes.io/projected/ff38de28-245a-4acd-b148-e2b71a457eff-kube-api-access-jjp9g\") pod \"apiserver-76f77b778f-snzmw\" (UID: \"ff38de28-245a-4acd-b148-e2b71a457eff\") " pod="openshift-apiserver/apiserver-76f77b778f-snzmw"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.774847 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgwv6\" (UniqueName: \"kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6\") pod \"controller-manager-879f6c89f-dnsbk\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.786534 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.786640 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzm9l\" (UniqueName: \"kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l\") pod \"console-f9d7485db-9f6pj\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " pod="openshift-console/console-f9d7485db-9f6pj"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.806715 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.826608 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.838586 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.838788 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.338757618 +0000 UTC m=+141.087987145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.839262 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.839689 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.339677713 +0000 UTC m=+141.088907260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.845968 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.847297 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.867302 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.880793 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-metrics-certs\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.886136 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.887382 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-snzmw"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.907123 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.910403 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9f6pj"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.919282 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-default-certificate\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.927687 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.939183 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/96b0dd64-7e97-4cac-bcd1-1cc312027a48-stats-auth\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.940284 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.940544 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.440489224 +0000 UTC m=+141.189718781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.941007 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:41 crc kubenswrapper[4882]: E1002 16:19:41.941703 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.441684277 +0000 UTC m=+141.190914024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.947874 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.957043 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b0dd64-7e97-4cac-bcd1-1cc312027a48-service-ca-bundle\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.966258 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 02 16:19:41 crc kubenswrapper[4882]: I1002 16:19:41.990734 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.010460 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.026727 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.044013 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.044872 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.544849563 +0000 UTC m=+141.294079090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.046853 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.070559 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.079200 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-apiservice-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.082129 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-webhook-cert\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.089557 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.106710 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.119314 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.121329 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-profile-collector-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.126805 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.145651 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.146115 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.646091855 +0000 UTC m=+141.395321382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.146413 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.149845 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-snzmw"]
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.160369 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"]
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.165625 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: W1002 16:19:42.170990 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfd762f_4be9_49a4_9851_f3211e11e6ad.slice/crio-9f6fcc34e437806b6270f76476ffe445b9d2b9da9a75ae59e4b771d1714ab276 WatchSource:0}: Error finding container 9f6fcc34e437806b6270f76476ffe445b9d2b9da9a75ae59e4b771d1714ab276: Status 404 returned error can't find the container with id 9f6fcc34e437806b6270f76476ffe445b9d2b9da9a75ae59e4b771d1714ab276
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.172032 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"]
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.186227 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.203991 4882 request.go:700] Waited for 1.015137622s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dcollect-profiles-config&limit=500&resourceVersion=0
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.205659 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.214720 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.226864 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.245344 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.246348 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.246619 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.746590978 +0000 UTC m=+141.495820545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.246943 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.247490 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.747474331 +0000 UTC m=+141.496703898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.256480 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-config\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.266414 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.274918 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.287305 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.307808 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.316182 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-service-ca\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.329430 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.346857 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.348600 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.349007 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.848972321 +0000 UTC m=+141.598201888 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.349361 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.350003 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.849974839 +0000 UTC m=+141.599204406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.367182 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.378118 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4106b164-7d6a-4837-840b-f5a068e1aec9-metrics-tls\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.390590 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.407063 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.426642 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.446782 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.451967 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.452415 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.952384424 +0000 UTC m=+141.701613981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.453302 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.453837 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:42.953812022 +0000 UTC m=+141.703041589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.466849 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.479679 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-key\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.487340 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.500440 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-serving-cert\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.507527 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.525765 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.539707 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-etcd-client\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.545670 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.554435 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.554636 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.054602733 +0000 UTC m=+141.803832260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.555150 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.555342 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09a298da-0301-49c1-b755-1b5ec0058b3e-signing-cabundle\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c"
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.555526 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.055514987 +0000 UTC m=+141.804744804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.565972 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.567204 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" event={"ID":"ff38de28-245a-4acd-b148-e2b71a457eff","Type":"ContainerStarted","Data":"567e2cefdaec00acaf87deecaa1fbf8309aa93ff250015f1ccc86e59591efcd1"}
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.568896 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" event={"ID":"6cfd762f-4be9-49a4-9851-f3211e11e6ad","Type":"ContainerStarted","Data":"9f6fcc34e437806b6270f76476ffe445b9d2b9da9a75ae59e4b771d1714ab276"}
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.587583 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.597604 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c465e72e-8499-4dce-ab77-064f7ecb1c81-srv-cert\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.607077 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.626950 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 02 16:19:42 crc kubenswrapper[4882]: W1002 16:19:42.627311 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5af616c_8948_402c_97b8_3aadd17673d2.slice/crio-e5a6c601d68cc183fb950d1e9f8e066d7b58892ebdcdebe15e027613e0c3a051 WatchSource:0}: Error finding container e5a6c601d68cc183fb950d1e9f8e066d7b58892ebdcdebe15e027613e0c3a051: Status 404 returned error can't find the container with id e5a6c601d68cc183fb950d1e9f8e066d7b58892ebdcdebe15e027613e0c3a051
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.633783 4882 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.633840 4882 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.633792 4882 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Oct 02 16:19:42 crc
kubenswrapper[4882]: E1002 16:19:42.633866 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token podName:638f69ac-9898-41f2-a2ee-0206e245db93 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.133846245 +0000 UTC m=+141.883075772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token") pod "machine-config-server-hmk79" (UID: "638f69ac-9898-41f2-a2ee-0206e245db93") : failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.633977 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca podName:53105ebf-8ac0-401a-8c49-b6c4780082e5 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.133956468 +0000 UTC m=+141.883185995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca") pod "marketplace-operator-79b997595-d972j" (UID: "53105ebf-8ac0-401a-8c49-b6c4780082e5") : failed to sync configmap cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.634019 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs podName:638f69ac-9898-41f2-a2ee-0206e245db93 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.133991629 +0000 UTC m=+141.883221156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs") pod "machine-config-server-hmk79" (UID: "638f69ac-9898-41f2-a2ee-0206e245db93") : failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.634699 4882 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.634781 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics podName:53105ebf-8ac0-401a-8c49-b6c4780082e5 nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.13477209 +0000 UTC m=+141.884001617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics") pod "marketplace-operator-79b997595-d972j" (UID: "53105ebf-8ac0-401a-8c49-b6c4780082e5") : failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.635176 4882 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.635260 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs podName:1ed1366e-852f-4e88-ae80-6fc761781c31 nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:43.135245533 +0000 UTC m=+141.884475140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs") pod "multus-admission-controller-857f4d67dd-fqvp9" (UID: "1ed1366e-852f-4e88-ae80-6fc761781c31") : failed to sync secret cache: timed out waiting for the condition Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.646272 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.656705 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.656879 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.156861523 +0000 UTC m=+141.906091050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.658274 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.658707 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.158697003 +0000 UTC m=+141.907926530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.665874 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.693193 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.708242 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.726043 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.746876 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.759997 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.761514 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.261472417 +0000 UTC m=+142.010701944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.767991 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.787583 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.806826 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.826692 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.856140 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.862763 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.863362 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.363339707 +0000 UTC m=+142.112569234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.869456 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.886735 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.906173 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.928397 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.946961 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.964052 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.964298 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.464273882 +0000 UTC m=+142.213503399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.964630 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:42 crc kubenswrapper[4882]: E1002 16:19:42.964995 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.464983552 +0000 UTC m=+142.214213079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.966469 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 16:19:42 crc kubenswrapper[4882]: I1002 16:19:42.985620 4882 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.006571 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.026113 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.046611 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.065861 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.066073 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.566040339 +0000 UTC m=+142.315269866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.066267 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.066815 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.067355 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:43.567346305 +0000 UTC m=+142.316575842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.086488 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.165711 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2d5\" (UniqueName: \"kubernetes.io/projected/a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3-kube-api-access-hk2d5\") pod \"authentication-operator-69f744f599-scbzd\" (UID: \"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.167595 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.167909 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.667860228 +0000 UTC m=+142.417089785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168308 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168444 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168571 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168741 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168845 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.168996 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.169323 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.669230735 +0000 UTC m=+142.418460462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.171168 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.172172 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.172792 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1ed1366e-852f-4e88-ae80-6fc761781c31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.173080 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-node-bootstrap-token\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.173809 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/638f69ac-9898-41f2-a2ee-0206e245db93-certs\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.190275 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxgsw\" (UniqueName: \"kubernetes.io/projected/e34155fb-8fe4-480a-aa87-518c3e344cab-kube-api-access-pxgsw\") pod \"cluster-samples-operator-665b6dd947-nwqqf\" (UID: \"e34155fb-8fe4-480a-aa87-518c3e344cab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.204264 4882 request.go:700] Waited for 1.681918069s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/registry/token Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.213510 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhp6t\" (UniqueName: \"kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t\") pod 
\"oauth-openshift-558db77b4-wcjdw\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.229554 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.256498 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5cmx\" (UniqueName: \"kubernetes.io/projected/cb409fa4-c522-4729-9c26-11d24aab19e3-kube-api-access-f5cmx\") pod \"console-operator-58897d9998-mw2kv\" (UID: \"cb409fa4-c522-4729-9c26-11d24aab19e3\") " pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.259936 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.268870 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pq49\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.270750 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.270928 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.770887799 +0000 UTC m=+142.520117366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.271557 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.272285 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:43.772256276 +0000 UTC m=+142.521485993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.287366 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzdz\" (UniqueName: \"kubernetes.io/projected/ed430b79-ad1a-456d-be2f-6cb51f2564dc-kube-api-access-vkzdz\") pod \"machine-api-operator-5694c8668f-9snj8\" (UID: \"ed430b79-ad1a-456d-be2f-6cb51f2564dc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.303469 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2sx9\" (UniqueName: \"kubernetes.io/projected/7886091e-37f9-44f7-b04d-a210a82a62a8-kube-api-access-j2sx9\") pod \"machine-approver-56656f9798-t52zc\" (UID: \"7886091e-37f9-44f7-b04d-a210a82a62a8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.327528 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2f6\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-kube-api-access-pc2f6\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.357405 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdkk8\" (UniqueName: \"kubernetes.io/projected/12d9299c-b3ee-40b9-a2d6-56159ba9ff66-kube-api-access-kdkk8\") pod \"openshift-config-operator-7777fb866f-xhcsv\" (UID: \"12d9299c-b3ee-40b9-a2d6-56159ba9ff66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.367290 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58g45\" (UniqueName: \"kubernetes.io/projected/3eb4771a-d742-436a-964c-c19d1ac905d6-kube-api-access-58g45\") pod \"openshift-apiserver-operator-796bbdcf4f-xtxmh\" (UID: \"3eb4771a-d742-436a-964c-c19d1ac905d6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.372859 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.373387 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.873269133 +0000 UTC m=+142.622498700 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.373891 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.374431 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.874414484 +0000 UTC m=+142.623644021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.386761 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvslk\" (UniqueName: \"kubernetes.io/projected/41edc773-9f85-408f-9605-a86000b41aa2-kube-api-access-wvslk\") pod \"downloads-7954f5f757-6n7gp\" (UID: \"41edc773-9f85-408f-9605-a86000b41aa2\") " pod="openshift-console/downloads-7954f5f757-6n7gp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.415721 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74p89\" (UniqueName: \"kubernetes.io/projected/a182d0c0-45a9-450b-affc-44caf339abd8-kube-api-access-74p89\") pod \"route-controller-manager-6576b87f9c-mfzth\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.423271 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.424652 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2fkm\" (UniqueName: \"kubernetes.io/projected/0ec64748-14d0-4078-a51e-deee8610c82f-kube-api-access-c2fkm\") pod \"apiserver-7bbb656c7d-jbrgv\" (UID: \"0ec64748-14d0-4078-a51e-deee8610c82f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.444171 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.445337 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6n7gp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.445505 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.445613 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.445832 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.446089 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.474517 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.476149 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.476747 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.476964 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.976924271 +0000 UTC m=+142.726153828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.477206 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.478113 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:43.978078233 +0000 UTC m=+142.727307790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.496738 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"] Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.497659 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkssm\" (UniqueName: \"kubernetes.io/projected/c465e72e-8499-4dce-ab77-064f7ecb1c81-kube-api-access-dkssm\") pod \"catalog-operator-68c6474976-mbjzj\" (UID: \"c465e72e-8499-4dce-ab77-064f7ecb1c81\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.508124 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmb6\" (UniqueName: \"kubernetes.io/projected/4106b164-7d6a-4837-840b-f5a068e1aec9-kube-api-access-sgmb6\") pod \"dns-operator-744455d44c-9t27z\" (UID: \"4106b164-7d6a-4837-840b-f5a068e1aec9\") " pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.525113 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.525824 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0133093-ff9a-4741-abf2-746361b98451-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ztdvl\" (UID: \"f0133093-ff9a-4741-abf2-746361b98451\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.529463 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfctr\" (UniqueName: \"kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr\") pod \"collect-profiles-29323695-6xdzv\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.532041 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kln2\" (UniqueName: \"kubernetes.io/projected/dba2a584-8773-444f-a47f-d29bec61b7f1-kube-api-access-2kln2\") pod \"kube-storage-version-migrator-operator-b67b599dd-nrq6m\" (UID: \"dba2a584-8773-444f-a47f-d29bec61b7f1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.548250 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxwr\" (UniqueName: \"kubernetes.io/projected/e8f1c724-37a2-4524-8e7c-21fb617b124a-kube-api-access-5gxwr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7n6gf\" (UID: \"e8f1c724-37a2-4524-8e7c-21fb617b124a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 
16:19:43.560909 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7krv\" (UniqueName: \"kubernetes.io/projected/e3e5c6ee-e13e-40e9-8c61-b0ac10d62323-kube-api-access-r7krv\") pod \"packageserver-d55dfcdfc-6xht6\" (UID: \"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.574932 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" event={"ID":"6cfd762f-4be9-49a4-9851-f3211e11e6ad","Type":"ContainerStarted","Data":"818b28340b1678d2be5b3cb17613fee6374ca8a6dd5566988b18415a113cbbd8"} Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.576654 4882 generic.go:334] "Generic (PLEG): container finished" podID="ff38de28-245a-4acd-b148-e2b71a457eff" containerID="60f7a33274a1870b8ddaa42dfd41cc0eb49c85bfcb55ae6aa76ce1c1daf17d36" exitCode=0 Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.576787 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" event={"ID":"ff38de28-245a-4acd-b148-e2b71a457eff","Type":"ContainerDied","Data":"60f7a33274a1870b8ddaa42dfd41cc0eb49c85bfcb55ae6aa76ce1c1daf17d36"} Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.578711 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.578776 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.07875782 +0000 UTC m=+142.827987347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.580331 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzmh\" (UniqueName: \"kubernetes.io/projected/1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7-kube-api-access-nzzmh\") pod \"etcd-operator-b45778765-lgptr\" (UID: \"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.590688 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.591517 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9f6pj" event={"ID":"c5af616c-8948-402c-97b8-3aadd17673d2","Type":"ContainerStarted","Data":"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0"} Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.591565 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9f6pj" event={"ID":"c5af616c-8948-402c-97b8-3aadd17673d2","Type":"ContainerStarted","Data":"e5a6c601d68cc183fb950d1e9f8e066d7b58892ebdcdebe15e027613e0c3a051"} Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.592339 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.092317131 +0000 UTC m=+142.841546658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.592447 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" event={"ID":"92e16305-ea70-49fd-b269-0e36792ee6ea","Type":"ContainerStarted","Data":"4d20d602d3a38e3a4a8ae883375f2437c8415b6593fa26116362b5db3536438c"} Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.596851 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.600681 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.618763 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.618839 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bgtw\" (UniqueName: \"kubernetes.io/projected/a0bfe57e-f325-4e0f-910b-0f402074eb76-kube-api-access-9bgtw\") pod \"migrator-59844c95c7-x8fdq\" (UID: \"a0bfe57e-f325-4e0f-910b-0f402074eb76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.622811 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kpc\" (UniqueName: \"kubernetes.io/projected/96b0dd64-7e97-4cac-bcd1-1cc312027a48-kube-api-access-l8kpc\") pod \"router-default-5444994796-vf2qc\" (UID: \"96b0dd64-7e97-4cac-bcd1-1cc312027a48\") " pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.635598 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh"] Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.642321 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k87tk\" (UniqueName: \"kubernetes.io/projected/1ed1366e-852f-4e88-ae80-6fc761781c31-kube-api-access-k87tk\") pod \"multus-admission-controller-857f4d67dd-fqvp9\" (UID: \"1ed1366e-852f-4e88-ae80-6fc761781c31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.676227 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.677770 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2cfab9-1657-44b4-b8d9-2bced2997338-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zc5l4\" (UID: \"7b2cfab9-1657-44b4-b8d9-2bced2997338\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.692864 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.693380 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.193354208 +0000 UTC m=+142.942583735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.703621 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqts\" (UniqueName: \"kubernetes.io/projected/638f69ac-9898-41f2-a2ee-0206e245db93-kube-api-access-ctqts\") pod \"machine-config-server-hmk79\" (UID: \"638f69ac-9898-41f2-a2ee-0206e245db93\") " pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.722553 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6f5t\" (UniqueName: \"kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t\") pod \"marketplace-operator-79b997595-d972j\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.724267 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkxm\" (UniqueName: \"kubernetes.io/projected/09a298da-0301-49c1-b755-1b5ec0058b3e-kube-api-access-hwkxm\") pod \"service-ca-9c57cc56f-z2v5c\" (UID: \"09a298da-0301-49c1-b755-1b5ec0058b3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.743568 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.746431 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.755869 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsx2d\" (UniqueName: \"kubernetes.io/projected/e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae-kube-api-access-nsx2d\") pod \"dns-default-6wpc8\" (UID: \"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae\") " pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.765476 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13689e84-053e-4f97-8689-6a4c14800153-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xvg9j\" (UID: \"13689e84-053e-4f97-8689-6a4c14800153\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.765722 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.774489 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.781009 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.789671 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795632 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wr8\" (UniqueName: \"kubernetes.io/projected/414ef5a5-d3ca-4879-a201-9b3f8f340740-kube-api-access-j5wr8\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795679 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhpp\" (UniqueName: \"kubernetes.io/projected/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-kube-api-access-gxhpp\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795707 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a563ac-8237-49c0-a830-23ad0150d1cb-serving-cert\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795763 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-images\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795791 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c101359-1d67-4282-bd53-04fb1d0a3977-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795827 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-socket-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795849 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-srv-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795906 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-registration-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.795925 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqjpk\" (UniqueName: \"kubernetes.io/projected/18a563ac-8237-49c0-a830-23ad0150d1cb-kube-api-access-kqjpk\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798485 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-csi-data-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798549 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-cert\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798603 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xbw\" (UniqueName: \"kubernetes.io/projected/0a678431-0e28-410a-867d-d2dcb8dcfe36-kube-api-access-s6xbw\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798623 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41af342-d652-4f68-adb6-563afb9ca544-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798654 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-plugins-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798687 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d41af342-d652-4f68-adb6-563afb9ca544-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798765 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798812 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4rt\" (UniqueName: \"kubernetes.io/projected/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-kube-api-access-bt4rt\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798839 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c101359-1d67-4282-bd53-04fb1d0a3977-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798871 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f855380f-e75e-4573-b8b1-b4d493210d4f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798935 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-mountpoint-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.798956 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a678431-0e28-410a-867d-d2dcb8dcfe36-proxy-tls\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799010 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799042 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799068 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2hpx5\" (UniqueName: \"kubernetes.io/projected/f855380f-e75e-4573-b8b1-b4d493210d4f-kube-api-access-2hpx5\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799084 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41af342-d652-4f68-adb6-563afb9ca544-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799104 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799123 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799146 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prgq\" (UniqueName: \"kubernetes.io/projected/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-kube-api-access-7prgq\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799203 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/d8995298-5850-4a91-b913-d9b134818b45-kube-api-access-z2rj6\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799292 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a678431-0e28-410a-867d-d2dcb8dcfe36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799325 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8995298-5850-4a91-b913-d9b134818b45-proxy-tls\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 
16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799353 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnmlg\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-kube-api-access-mnmlg\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.799375 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a563ac-8237-49c0-a830-23ad0150d1cb-config\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.799999 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.299983697 +0000 UTC m=+143.049213224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.813163 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.827068 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.860559 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.883533 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.892009 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mw2kv"] Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.900856 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.901102 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.401073426 +0000 UTC m=+143.150302953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901163 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901231 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prgq\" (UniqueName: \"kubernetes.io/projected/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-kube-api-access-7prgq\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901351 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/d8995298-5850-4a91-b913-d9b134818b45-kube-api-access-z2rj6\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901413 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a678431-0e28-410a-867d-d2dcb8dcfe36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901441 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8995298-5850-4a91-b913-d9b134818b45-proxy-tls\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901461 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmlg\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-kube-api-access-mnmlg\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901493 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a563ac-8237-49c0-a830-23ad0150d1cb-config\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 
16:19:43.901534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wr8\" (UniqueName: \"kubernetes.io/projected/414ef5a5-d3ca-4879-a201-9b3f8f340740-kube-api-access-j5wr8\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901555 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhpp\" (UniqueName: \"kubernetes.io/projected/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-kube-api-access-gxhpp\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901577 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a563ac-8237-49c0-a830-23ad0150d1cb-serving-cert\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901619 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-images\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901672 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c101359-1d67-4282-bd53-04fb1d0a3977-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901708 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-socket-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901743 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-srv-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901826 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-registration-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.901853 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjpk\" (UniqueName: \"kubernetes.io/projected/18a563ac-8237-49c0-a830-23ad0150d1cb-kube-api-access-kqjpk\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: 
\"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902008 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-csi-data-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902075 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-cert\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902245 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xbw\" (UniqueName: \"kubernetes.io/projected/0a678431-0e28-410a-867d-d2dcb8dcfe36-kube-api-access-s6xbw\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902295 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41af342-d652-4f68-adb6-563afb9ca544-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902343 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-plugins-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902407 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d41af342-d652-4f68-adb6-563afb9ca544-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902436 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902488 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4rt\" (UniqueName: \"kubernetes.io/projected/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-kube-api-access-bt4rt\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 
16:19:43.902524 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f855380f-e75e-4573-b8b1-b4d493210d4f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902544 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c101359-1d67-4282-bd53-04fb1d0a3977-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902585 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-mountpoint-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902605 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a678431-0e28-410a-867d-d2dcb8dcfe36-proxy-tls\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902640 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902679 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902717 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpx5\" (UniqueName: \"kubernetes.io/projected/f855380f-e75e-4573-b8b1-b4d493210d4f-kube-api-access-2hpx5\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902738 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41af342-d652-4f68-adb6-563afb9ca544-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.902757 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.903313 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a563ac-8237-49c0-a830-23ad0150d1cb-config\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.903901 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a678431-0e28-410a-867d-d2dcb8dcfe36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.906309 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41af342-d652-4f68-adb6-563afb9ca544-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.906672 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-mountpoint-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.906852 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-socket-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.906940 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-plugins-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.907936 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-images\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.908400 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8995298-5850-4a91-b913-d9b134818b45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.908840 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.909133 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8995298-5850-4a91-b913-d9b134818b45-proxy-tls\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.909349 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-csi-data-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.909407 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-registration-dir\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:43 crc kubenswrapper[4882]: E1002 16:19:43.909701 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.409685181 +0000 UTC m=+143.158914708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.910030 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a563ac-8237-49c0-a830-23ad0150d1cb-serving-cert\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.911078 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c101359-1d67-4282-bd53-04fb1d0a3977-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.912669 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c101359-1d67-4282-bd53-04fb1d0a3977-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.914723 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.916063 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a678431-0e28-410a-867d-d2dcb8dcfe36-proxy-tls\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.919543 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/414ef5a5-d3ca-4879-a201-9b3f8f340740-srv-cert\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.919743 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.920361 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f855380f-e75e-4573-b8b1-b4d493210d4f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.925889 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-cert\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.937716 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41af342-d652-4f68-adb6-563afb9ca544-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.945118 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.964172 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prgq\" (UniqueName: \"kubernetes.io/projected/5ec03001-e8a9-4e0f-a20a-ad45cb0542c6-kube-api-access-7prgq\") pod \"control-plane-machine-set-operator-78cbb6b69f-7cgt8\" (UID: \"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.967400 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.983292 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rj6\" (UniqueName: \"kubernetes.io/projected/d8995298-5850-4a91-b913-d9b134818b45-kube-api-access-z2rj6\") pod \"machine-config-operator-74547568cd-db4xr\" (UID: \"d8995298-5850-4a91-b913-d9b134818b45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:43 crc kubenswrapper[4882]: I1002 16:19:43.990515 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hmk79" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.003536 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.004540 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:44.504509518 +0000 UTC m=+143.253739045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.014579 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wr8\" (UniqueName: \"kubernetes.io/projected/414ef5a5-d3ca-4879-a201-9b3f8f340740-kube-api-access-j5wr8\") pod \"olm-operator-6b444d44fb-c4s5w\" (UID: \"414ef5a5-d3ca-4879-a201-9b3f8f340740\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.028415 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhpp\" (UniqueName: \"kubernetes.io/projected/a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4-kube-api-access-gxhpp\") pod \"csi-hostpathplugin-x5zj4\" (UID: \"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4\") " pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.058100 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjpk\" (UniqueName: \"kubernetes.io/projected/18a563ac-8237-49c0-a830-23ad0150d1cb-kube-api-access-kqjpk\") pod \"service-ca-operator-777779d784-jmqdb\" (UID: \"18a563ac-8237-49c0-a830-23ad0150d1cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.067682 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4rt\" (UniqueName: \"kubernetes.io/projected/8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6-kube-api-access-bt4rt\") pod \"ingress-canary-zjj8d\" (UID: \"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6\") " pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.087512 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnmlg\" (UniqueName: \"kubernetes.io/projected/3c101359-1d67-4282-bd53-04fb1d0a3977-kube-api-access-mnmlg\") pod \"ingress-operator-5b745b69d9-dbw9p\" (UID: \"3c101359-1d67-4282-bd53-04fb1d0a3977\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.104091 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xbw\" (UniqueName: \"kubernetes.io/projected/0a678431-0e28-410a-867d-d2dcb8dcfe36-kube-api-access-s6xbw\") pod \"machine-config-controller-84d6567774-qhq56\" (UID: \"0a678431-0e28-410a-867d-d2dcb8dcfe36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.105566 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc 
kubenswrapper[4882]: E1002 16:19:44.106125 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.606103141 +0000 UTC m=+143.355332668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.120579 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.148684 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.151542 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpx5\" (UniqueName: \"kubernetes.io/projected/f855380f-e75e-4573-b8b1-b4d493210d4f-kube-api-access-2hpx5\") pod \"package-server-manager-789f6589d5-zwrcp\" (UID: \"f855380f-e75e-4573-b8b1-b4d493210d4f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.171590 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d41af342-d652-4f68-adb6-563afb9ca544-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vczsf\" (UID: \"d41af342-d652-4f68-adb6-563afb9ca544\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.172869 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.209360 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.209499 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.709475692 +0000 UTC m=+143.458705219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.209720 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.212342 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.71014305 +0000 UTC m=+143.459372577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.244703 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.248965 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.258074 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" Oct 02 16:19:44 crc kubenswrapper[4882]: W1002 16:19:44.272403 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod638f69ac_9898_41f2_a2ee_0206e245db93.slice/crio-dc4ca5522d3e0f848b1c8e6c1eade6165901d4f7a1ad93fbc3c40aafb999fac4 WatchSource:0}: Error finding container dc4ca5522d3e0f848b1c8e6c1eade6165901d4f7a1ad93fbc3c40aafb999fac4: Status 404 returned error can't find the container with id dc4ca5522d3e0f848b1c8e6c1eade6165901d4f7a1ad93fbc3c40aafb999fac4 Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.313241 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.313726 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.813708906 +0000 UTC m=+143.562938433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.320715 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.321722 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjj8d" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.356841 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.398047 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.414819 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.415309 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:44.915291429 +0000 UTC m=+143.664520956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.518119 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.518637 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.018602207 +0000 UTC m=+143.767831874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.605694 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" event={"ID":"92e16305-ea70-49fd-b269-0e36792ee6ea","Type":"ContainerStarted","Data":"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.608256 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" event={"ID":"cb409fa4-c522-4729-9c26-11d24aab19e3","Type":"ContainerStarted","Data":"d912b7a84972e30fc08c873d2b79fe566708df6ffea02eb13dc66f49fc377a17"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.611918 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmk79" event={"ID":"638f69ac-9898-41f2-a2ee-0206e245db93","Type":"ContainerStarted","Data":"dc4ca5522d3e0f848b1c8e6c1eade6165901d4f7a1ad93fbc3c40aafb999fac4"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.614508 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" event={"ID":"7886091e-37f9-44f7-b04d-a210a82a62a8","Type":"ContainerStarted","Data":"8d9b38c6c0e2cdc619124128aa8e6f00fbbc4294ca730d6d27cacdcb34b65798"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.614582 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" event={"ID":"7886091e-37f9-44f7-b04d-a210a82a62a8","Type":"ContainerStarted","Data":"24b8711aab50ebdc90a8dd883ef883539e97b257d0c402c7f2e80a9b97182e26"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.617271 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" event={"ID":"3eb4771a-d742-436a-964c-c19d1ac905d6","Type":"ContainerStarted","Data":"569157dd2df9b6788dfa63351a33260bd9f227a06e81fabc01120d060c29f7c8"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.617325 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" event={"ID":"3eb4771a-d742-436a-964c-c19d1ac905d6","Type":"ContainerStarted","Data":"a758ef971f4d6c9bb60c2d76232e96cfa745cd531e14c1d66fdcad68cd8fa40b"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.619019 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vf2qc" event={"ID":"96b0dd64-7e97-4cac-bcd1-1cc312027a48","Type":"ContainerStarted","Data":"aa5348f9ef61255ca940ee6b9da7c8f0f44d5c99fc60d665b6f44eba8729619e"} Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.619341 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.619423 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.619738 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.119722377 +0000 UTC m=+143.868951904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.697319 4882 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dnsbk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.697400 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.720010 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.720351 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.220311522 +0000 UTC m=+143.969541049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.720735 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.723542 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.22352418 +0000 UTC m=+143.972753707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.822494 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.822847 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.322815209 +0000 UTC m=+144.072044726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.823441 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.823952 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.323932649 +0000 UTC m=+144.073162176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.940973 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:44 crc kubenswrapper[4882]: E1002 16:19:44.941738 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.441720424 +0000 UTC m=+144.190949941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:44 crc kubenswrapper[4882]: I1002 16:19:44.978554 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbzd"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.047488 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.048099 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.548078597 +0000 UTC m=+144.297308124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.050270 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.111939 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6n7gp"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.139416 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.145105 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9snj8"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.152348 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.153313 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.153616 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.653584436 +0000 UTC m=+144.402813963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.153739 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.154474 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.65446684 +0000 UTC m=+144.403696367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.173869 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded430b79_ad1a_456d_be2f_6cb51f2564dc.slice/crio-153f26a82c1517dc6475f1dcc1a0c04808d881388c2a1e91a1560c09ba6d1327 WatchSource:0}: Error finding container 153f26a82c1517dc6475f1dcc1a0c04808d881388c2a1e91a1560c09ba6d1327: Status 404 returned error can't find the container with id 153f26a82c1517dc6475f1dcc1a0c04808d881388c2a1e91a1560c09ba6d1327 Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.180307 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d9299c_b3ee_40b9_a2d6_56159ba9ff66.slice/crio-f6035d3ea4cc204b2c0c19292f6ba2d1f370f0150733e08043e5d4149fb88d53 WatchSource:0}: Error finding container f6035d3ea4cc204b2c0c19292f6ba2d1f370f0150733e08043e5d4149fb88d53: Status 404 returned error can't find the container with id f6035d3ea4cc204b2c0c19292f6ba2d1f370f0150733e08043e5d4149fb88d53 Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.255426 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.257238 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.757160982 +0000 UTC m=+144.506390509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.274264 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.297571 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wpc8"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.304206 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.312826 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.314762 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl"] Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.318360 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc465e72e_8499_4dce_ab77_064f7ecb1c81.slice/crio-5e16dae2bd8453f2135328c67c00b0b47bfe429ef53da28527518f74b10c320d WatchSource:0}: Error finding container 5e16dae2bd8453f2135328c67c00b0b47bfe429ef53da28527518f74b10c320d: Status 404 returned error can't find the container with id 5e16dae2bd8453f2135328c67c00b0b47bfe429ef53da28527518f74b10c320d Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.318508 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.329324 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.337390 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9t27z"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.342199 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lgptr"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.360529 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.360962 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.860941684 +0000 UTC m=+144.610171211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.454272 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e5c6ee_e13e_40e9_8c61_b0ac10d62323.slice/crio-334ccb080f5ff7084da392b14552ffd29e20be5da8aa9237347ae3ce04f09f31 WatchSource:0}: Error finding container 334ccb080f5ff7084da392b14552ffd29e20be5da8aa9237347ae3ce04f09f31: Status 404 returned error can't find the container with id 334ccb080f5ff7084da392b14552ffd29e20be5da8aa9237347ae3ce04f09f31 Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.456279 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c8d6a7_c7f0_4031_b9ab_70f4bc41b00f.slice/crio-4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3 WatchSource:0}: Error finding container 4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3: Status 404 returned error can't find the container with id 4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3 Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.461811 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4106b164_7d6a_4837_840b_f5a068e1aec9.slice/crio-703b275794844fc6820d2d47c74430b1aaaa0090538a0eb9e3eb4c2b17861110 WatchSource:0}: Error finding container 703b275794844fc6820d2d47c74430b1aaaa0090538a0eb9e3eb4c2b17861110: Status 404 returned error can't find the container with id 703b275794844fc6820d2d47c74430b1aaaa0090538a0eb9e3eb4c2b17861110 Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.467330 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.467451 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.96742688 +0000 UTC m=+144.716656427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.467903 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.473324 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:45.9732971 +0000 UTC m=+144.722526627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.578429 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.579006 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.078967104 +0000 UTC m=+144.828196631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.593146 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqvp9"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.604320 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2v5c"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.608980 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.618854 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.623618 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" podStartSLOduration=122.623595192 podStartE2EDuration="2m2.623595192s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:45.604459329 +0000 UTC m=+144.353688856" watchObservedRunningTime="2025-10-02 16:19:45.623595192 +0000 UTC m=+144.372824709" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.629186 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" event={"ID":"c465e72e-8499-4dce-ab77-064f7ecb1c81","Type":"ContainerStarted","Data":"5e16dae2bd8453f2135328c67c00b0b47bfe429ef53da28527518f74b10c320d"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.630572 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjj8d"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.642383 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vf2qc" event={"ID":"96b0dd64-7e97-4cac-bcd1-1cc312027a48","Type":"ContainerStarted","Data":"7cb915f0c57433969317477a198968d6b081d4e734d512f53daf17288cdc420d"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.649780 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9f6pj" podStartSLOduration=122.649754395 podStartE2EDuration="2m2.649754395s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:45.64661711 +0000 UTC m=+144.395846637" watchObservedRunningTime="2025-10-02 16:19:45.649754395 +0000 UTC m=+144.398983922" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.665415 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" event={"ID":"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7","Type":"ContainerStarted","Data":"7a1e9282407eff6a4a889b24b209d2bb4180896680cc5ca59fd442c4bd2dd13c"} Oct 02 
16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.683748 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.684084 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.184069352 +0000 UTC m=+144.933298879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.686449 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.695825 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.700152 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.702295 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.717803 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x5zj4"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.724203 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.726533 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.730008 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.742099 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.752527 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vf2qc" podStartSLOduration=121.752498159 podStartE2EDuration="2m1.752498159s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:45.732075472 +0000 UTC m=+144.481305019" 
watchObservedRunningTime="2025-10-02 16:19:45.752498159 +0000 UTC m=+144.501727686" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.753031 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.753091 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" event={"ID":"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f","Type":"ContainerStarted","Data":"4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.756397 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.758173 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w"] Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.758390 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" event={"ID":"ed430b79-ad1a-456d-be2f-6cb51f2564dc","Type":"ContainerStarted","Data":"153f26a82c1517dc6475f1dcc1a0c04808d881388c2a1e91a1560c09ba6d1327"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.764619 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" event={"ID":"cb409fa4-c522-4729-9c26-11d24aab19e3","Type":"ContainerStarted","Data":"bb2d526b1a55470c502ddb42ff2bf245e1ec794722d93e816becf57bf059d08f"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.764899 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.766579 4882 patch_prober.go:28] interesting pod/console-operator-58897d9998-mw2kv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.766656 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" podUID="cb409fa4-c522-4729-9c26-11d24aab19e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.767721 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" event={"ID":"4106b164-7d6a-4837-840b-f5a068e1aec9","Type":"ContainerStarted","Data":"703b275794844fc6820d2d47c74430b1aaaa0090538a0eb9e3eb4c2b17861110"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.770605 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" event={"ID":"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3","Type":"ContainerStarted","Data":"42d0d43b21518c68ff919a370c797f327d1dad71dea2001f4c8a97521c16f3cc"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.776793 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n7gp" 
event={"ID":"41edc773-9f85-408f-9605-a86000b41aa2","Type":"ContainerStarted","Data":"7f6e0e9e444f3b40d074c9dd8ff808bf701cbe8f5a0810aca1009c81dc1b37f1"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.782470 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wpc8" event={"ID":"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae","Type":"ContainerStarted","Data":"33638e355d6205a9dde441cb8b7737a918fd2b7598d089c181b40c3abcc46091"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.784966 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.785534 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.285444598 +0000 UTC m=+145.034674165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.797545 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" event={"ID":"12d9299c-b3ee-40b9-a2d6-56159ba9ff66","Type":"ContainerStarted","Data":"f6035d3ea4cc204b2c0c19292f6ba2d1f370f0150733e08043e5d4149fb88d53"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.801661 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" event={"ID":"a182d0c0-45a9-450b-affc-44caf339abd8","Type":"ContainerStarted","Data":"b66e511f441625a6fb0032834d0384a54a6f0bdecb4c1f639ba11c2c704d7022"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.803311 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" event={"ID":"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323","Type":"ContainerStarted","Data":"334ccb080f5ff7084da392b14552ffd29e20be5da8aa9237347ae3ce04f09f31"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.807743 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" event={"ID":"e8f1c724-37a2-4524-8e7c-21fb617b124a","Type":"ContainerStarted","Data":"48d67783312f8c1c2df466eeb161a69be25df2da9011ba8265fe7c0a8fd7a1d8"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.811851 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" event={"ID":"f0133093-ff9a-4741-abf2-746361b98451","Type":"ContainerStarted","Data":"563800bfabd318a85df78a1f7af364c5e4b5767a96ae6a912addfa629454eba4"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.812715 4882 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" event={"ID":"0ec64748-14d0-4078-a51e-deee8610c82f","Type":"ContainerStarted","Data":"adc37d2cbc8c01a87ab7b28235ff47a2abd9b2d43b5b5e3725235983e0c53e37"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.817635 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" event={"ID":"ff38de28-245a-4acd-b148-e2b71a457eff","Type":"ContainerStarted","Data":"f80897b9818e586b8e8b54c9c2d175894b5269e965bc80cc58d887b6aa690634"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.820363 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hmk79" event={"ID":"638f69ac-9898-41f2-a2ee-0206e245db93","Type":"ContainerStarted","Data":"17caa879dd89e9d1e55d7f71e0f107024b13b88ae0a3a0ecf1b48e26997a2727"} Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.821063 4882 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dnsbk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.821118 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.821396 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.823061 4882 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wcjdw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.823112 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.829307 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vf2qc" Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.829600 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41af342_d652_4f68_adb6_563afb9ca544.slice/crio-b4e3ce8e8cd91697453465397a148af877d461572de106f819253bf0283de4c7 WatchSource:0}: Error finding container b4e3ce8e8cd91697453465397a148af877d461572de106f819253bf0283de4c7: Status 404 returned error can't find the container with id b4e3ce8e8cd91697453465397a148af877d461572de106f819253bf0283de4c7 Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.838322 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="Get 
\"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.838382 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.875828 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a563ac_8237_49c0_a830_23ad0150d1cb.slice/crio-dd26d5c9f9cdb8d8831a9e1abb8b7df068ccb1a13377f51e53d7f24832871d2a WatchSource:0}: Error finding container dd26d5c9f9cdb8d8831a9e1abb8b7df068ccb1a13377f51e53d7f24832871d2a: Status 404 returned error can't find the container with id dd26d5c9f9cdb8d8831a9e1abb8b7df068ccb1a13377f51e53d7f24832871d2a Oct 02 16:19:45 crc kubenswrapper[4882]: W1002 16:19:45.880424 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba2a584_8773_444f_a47f_d29bec61b7f1.slice/crio-3e8d6dabbc163b4c463a56d4f88a08dca0181cfd05d7de96caa1f2800b53d0de WatchSource:0}: Error finding container 3e8d6dabbc163b4c463a56d4f88a08dca0181cfd05d7de96caa1f2800b53d0de: Status 404 returned error can't find the container with id 3e8d6dabbc163b4c463a56d4f88a08dca0181cfd05d7de96caa1f2800b53d0de Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.888344 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.888787 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.388768718 +0000 UTC m=+145.137998255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:45 crc kubenswrapper[4882]: I1002 16:19:45.990716 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:45 crc kubenswrapper[4882]: E1002 16:19:45.991514 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 16:19:46.49146473 +0000 UTC m=+145.240694257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:45.998180 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:45.998648 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.498631505 +0000 UTC m=+145.247861032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.100264 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.100522 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.600477685 +0000 UTC m=+145.349707212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.100974 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.101433 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.601420871 +0000 UTC m=+145.350650398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.201881 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.202158 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.702116309 +0000 UTC m=+145.451345846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.202244 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.202807 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.702788187 +0000 UTC m=+145.452017724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.303970 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.304538 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.804462381 +0000 UTC m=+145.553692018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.406259 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:46.906241169 +0000 UTC m=+145.655470706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.405783 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.507908 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.508087 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.008052547 +0000 UTC m=+145.757282084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.508698 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.509141 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.009129546 +0000 UTC m=+145.758359083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.609542 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.609698 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.10967422 +0000 UTC m=+145.858903747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.609794 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.610170 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.110162183 +0000 UTC m=+145.859391710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.710712 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.710926 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.210885392 +0000 UTC m=+145.960114929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.711144 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.711673 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.211654633 +0000 UTC m=+145.960884180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.813627 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.813868 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.313825181 +0000 UTC m=+146.063054708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.814163 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.814575 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.314559461 +0000 UTC m=+146.063788988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.830460 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.830829 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.833414 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjj8d" event={"ID":"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6","Type":"ContainerStarted","Data":"f3a0465dcf9496d3a241ff974369034f81a471cee7204e5f15a5f83ee8b56639"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.836473 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" event={"ID":"18a563ac-8237-49c0-a830-23ad0150d1cb","Type":"ContainerStarted","Data":"dd26d5c9f9cdb8d8831a9e1abb8b7df068ccb1a13377f51e53d7f24832871d2a"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.837896 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" event={"ID":"f855380f-e75e-4573-b8b1-b4d493210d4f","Type":"ContainerStarted","Data":"8844d56e150b3e055e61efac9b9f3368992c630ea1e35ce6d2667573bdf2a1d2"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.839276 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" event={"ID":"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6","Type":"ContainerStarted","Data":"9c710e5f952fa136b191cc457e93e06c4979410733fe5e8e26c3d8624f158e48"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.843165 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" event={"ID":"7886091e-37f9-44f7-b04d-a210a82a62a8","Type":"ContainerStarted","Data":"e07f28331c7e355050aca27f070b1c0ca0ce3bf38f0950e95e1af503fff51d04"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.848146 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" event={"ID":"13689e84-053e-4f97-8689-6a4c14800153","Type":"ContainerStarted","Data":"122a81a0b994b79caa9b245b3664ff86c182609fd2d8df2ad0b1884238ab91c7"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.850600 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" event={"ID":"1ed1366e-852f-4e88-ae80-6fc761781c31","Type":"ContainerStarted","Data":"27681bc9ec997ee43896062abc11423c7492852b616069d004c94986b1b14434"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.858565 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" event={"ID":"e34155fb-8fe4-480a-aa87-518c3e344cab","Type":"ContainerStarted","Data":"12669076d58630cc43f85c4e03b04c77e3e0c27ffc0caec53e10b0db2d77daf8"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.868086 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6n7gp" event={"ID":"41edc773-9f85-408f-9605-a86000b41aa2","Type":"ContainerStarted","Data":"58f5334bb3aa0cffb69232309dadcc26dce14ed632f1e1518644ee10a8e70896"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.873992 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" event={"ID":"a0bfe57e-f325-4e0f-910b-0f402074eb76","Type":"ContainerStarted","Data":"c506ddf4c3360d29d3fa95f7bb7ca143aa94a94762b7a92e842ea48fb1534e9d"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.875311 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" event={"ID":"3c101359-1d67-4282-bd53-04fb1d0a3977","Type":"ContainerStarted","Data":"ae7ad56499e05c5bd5dba5ec956ffd5946c5a5430b7e2c371ca12e4445068e19"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.879693 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" event={"ID":"09a298da-0301-49c1-b755-1b5ec0058b3e","Type":"ContainerStarted","Data":"e2f331207146d606589ed12da56034dfa2112f1356b72a44d2d0f98620579bb1"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.884092 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" event={"ID":"e8f1c724-37a2-4524-8e7c-21fb617b124a","Type":"ContainerStarted","Data":"de9b769549fc3a53292deb7f35669feb8ae7920d9c3440b1af2926c45ed546f9"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.892705 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wpc8" event={"ID":"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae","Type":"ContainerStarted","Data":"74935af75d95f1a7c7dec382d4c8d0a64b51da167ea1560f5b0a00b6df01d3c6"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.896235 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" event={"ID":"d41af342-d652-4f68-adb6-563afb9ca544","Type":"ContainerStarted","Data":"b4e3ce8e8cd91697453465397a148af877d461572de106f819253bf0283de4c7"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.902828 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" event={"ID":"0a678431-0e28-410a-867d-d2dcb8dcfe36","Type":"ContainerStarted","Data":"c327a65bc10622724801569f347222ecc9da2faab3b592ee0e95fd37aa161ff5"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.904599 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" event={"ID":"7b2cfab9-1657-44b4-b8d9-2bced2997338","Type":"ContainerStarted","Data":"a9bfcdc2698d65d3bed3c5a5ab7151ec77fedcf1ffd15654e15d7e5a04c5f65d"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.907311 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" podStartSLOduration=123.907294901 podStartE2EDuration="2m3.907294901s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:46.90356532 +0000 UTC m=+145.652794857" watchObservedRunningTime="2025-10-02 16:19:46.907294901 +0000 UTC m=+145.656524428"
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.908731 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" event={"ID":"414ef5a5-d3ca-4879-a201-9b3f8f340740","Type":"ContainerStarted","Data":"4b8f21f7db4da02e84d7f490962655c7f285491298b19095e82a85198fa9e030"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.913152 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" event={"ID":"e3e5c6ee-e13e-40e9-8c61-b0ac10d62323","Type":"ContainerStarted","Data":"4d87f603a2646855c9320f9bc5336b2d546cff21111e89ea31484d74488d5933"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.915787 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.915912 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.415887177 +0000 UTC m=+146.165116704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.916163 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:46 crc kubenswrapper[4882]: E1002 16:19:46.916517 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.416509413 +0000 UTC m=+146.165738940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.932533 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" event={"ID":"c465e72e-8499-4dce-ab77-064f7ecb1c81","Type":"ContainerStarted","Data":"6b4af9e186bd5c62e9fbe42f2f2df90e91f51e7568a712865aa59b17231c77a6"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.946364 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtxmh" podStartSLOduration=123.946337697 podStartE2EDuration="2m3.946337697s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:46.9402263 +0000 UTC m=+145.689455827" watchObservedRunningTime="2025-10-02 16:19:46.946337697 +0000 UTC m=+145.695567244"
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.947541 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" event={"ID":"d8995298-5850-4a91-b913-d9b134818b45","Type":"ContainerStarted","Data":"bee39d29cb5c78b521d05948f13374ccb228097489251eaf2aa3668ce889d66c"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.956167 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" event={"ID":"dba2a584-8773-444f-a47f-d29bec61b7f1","Type":"ContainerStarted","Data":"3e8d6dabbc163b4c463a56d4f88a08dca0181cfd05d7de96caa1f2800b53d0de"}
Oct 02 16:19:46 crc kubenswrapper[4882]: I1002 16:19:46.980029 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" podStartSLOduration=123.980004466 podStartE2EDuration="2m3.980004466s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:46.977099176 +0000 UTC m=+145.726328703" watchObservedRunningTime="2025-10-02 16:19:46.980004466 +0000 UTC m=+145.729233993"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.002093 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" event={"ID":"0ec64748-14d0-4078-a51e-deee8610c82f","Type":"ContainerStarted","Data":"f6acade73dba52709ba5797d608185108495503b75b1c0dcd11353f0d595f228"}
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.007121 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" event={"ID":"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4","Type":"ContainerStarted","Data":"966b60d35a943039bb770cb089db418a386a76745fb376fb9d0fd7411727bfbc"}
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.010411 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" event={"ID":"53105ebf-8ac0-401a-8c49-b6c4780082e5","Type":"ContainerStarted","Data":"8a94541c4895bb811e8ea4d5676e28d771455e87c3a131339d45073381f072be"}
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.013293 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" event={"ID":"ed430b79-ad1a-456d-be2f-6cb51f2564dc","Type":"ContainerStarted","Data":"ddff148e1f049e1a5a589c1d1ef5bc9d4b2735474cc417f6aa02b86cae1ea523"}
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.017158 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.017734 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.517716175 +0000 UTC m=+146.266945702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.019671 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" event={"ID":"a9c8cbc2-84d2-44b7-a9f7-cd38c477d0a3","Type":"ContainerStarted","Data":"9d2a8ccdd98ac85c313d6bef245e06319dd45955b48a15b080100512076404a7"}
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.020759 4882 patch_prober.go:28] interesting pod/console-operator-58897d9998-mw2kv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.020853 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mw2kv" podUID="cb409fa4-c522-4729-9c26-11d24aab19e3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.021089 4882 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wcjdw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body=
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.021169 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.021771 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hmk79" podStartSLOduration=7.021750105 podStartE2EDuration="7.021750105s" podCreationTimestamp="2025-10-02 16:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:47.016266185 +0000 UTC m=+145.765495732" watchObservedRunningTime="2025-10-02 16:19:47.021750105 +0000 UTC m=+145.770979632"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.065650 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbzd" podStartSLOduration=124.065616162 podStartE2EDuration="2m4.065616162s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:47.060186324 +0000 UTC m=+145.809415851" watchObservedRunningTime="2025-10-02 16:19:47.065616162 +0000 UTC m=+145.814845689"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.119351 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.123896 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.623873662 +0000 UTC m=+146.373103409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.222616 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.223025 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.723008027 +0000 UTC m=+146.472237554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.324767 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.325279 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.825257428 +0000 UTC m=+146.574487035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.426008 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.426366 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.926336606 +0000 UTC m=+146.675566133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.426680 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.427085 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:47.927072056 +0000 UTC m=+146.676301583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.527488 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.527772 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.027748554 +0000 UTC m=+146.776978081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.528149 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.528750 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.02873948 +0000 UTC m=+146.777969007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.629006 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.629697 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.129670094 +0000 UTC m=+146.878899621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.731394 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.731929 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.231906284 +0000 UTC m=+146.981135811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.833457 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.833608 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.333582129 +0000 UTC m=+147.082811666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.834012 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.834406 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.334394142 +0000 UTC m=+147.083623669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.835389 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 16:19:47 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld
Oct 02 16:19:47 crc kubenswrapper[4882]: [+]process-running ok
Oct 02 16:19:47 crc kubenswrapper[4882]: healthz check failed
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.835430 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 16:19:47 crc kubenswrapper[4882]: I1002 16:19:47.935321 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:47 crc kubenswrapper[4882]: E1002 16:19:47.935886 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.43585841 +0000 UTC m=+147.185087937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.025828 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" event={"ID":"3c101359-1d67-4282-bd53-04fb1d0a3977","Type":"ContainerStarted","Data":"fd9bb25fd72c85c5a526924b0de39495ee88e4c55ed9356efd25fa537712f29a"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.029499 4882 generic.go:334] "Generic (PLEG): container finished" podID="12d9299c-b3ee-40b9-a2d6-56159ba9ff66" containerID="c81298c7fc649f551191b6ca9130ab59432a74b3788813ede218b6fe0be01b7b" exitCode=0
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.029600 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" event={"ID":"12d9299c-b3ee-40b9-a2d6-56159ba9ff66","Type":"ContainerDied","Data":"c81298c7fc649f551191b6ca9130ab59432a74b3788813ede218b6fe0be01b7b"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.031630 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" event={"ID":"e34155fb-8fe4-480a-aa87-518c3e344cab","Type":"ContainerStarted","Data":"b576b0c0df0a3ee2ff7c31aef7e21bd54cfa2254c9c1c29e3a5f95fb1ad20380"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.033092 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" event={"ID":"4106b164-7d6a-4837-840b-f5a068e1aec9","Type":"ContainerStarted","Data":"572d265a540efc6cdfe85c381b1212b0081a6f6f8a1e5cb8f2fcfad2d80ccd8c"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.034809 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" event={"ID":"7b2cfab9-1657-44b4-b8d9-2bced2997338","Type":"ContainerStarted","Data":"a9e4fec535c0632316a77005d53a25bc444c1097f4b0def36a7c6d37addff40e"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.036741 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" event={"ID":"a0bfe57e-f325-4e0f-910b-0f402074eb76","Type":"ContainerStarted","Data":"200226088e4eb6d9e62ef63a6e77890c5e93c2ee8c9dbe88d045bbf565d72e10"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.036996 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.037460 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.537439052 +0000 UTC m=+147.286668589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.038200 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjj8d" event={"ID":"8a50d28b-2b5e-4ca7-9547-14d5f41e2bd6","Type":"ContainerStarted","Data":"9733f812aba20ee01990971586b8f47112e77213a27f5588054386aa8ac600d7"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.039181 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" event={"ID":"f855380f-e75e-4573-b8b1-b4d493210d4f","Type":"ContainerStarted","Data":"9334f1fe8bb5b3c516595d56be0959f5b39cf77a0bc9cbe0be3c7268c1c7c838"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.041375 4882 generic.go:334] "Generic (PLEG): container finished" podID="0ec64748-14d0-4078-a51e-deee8610c82f" containerID="f6acade73dba52709ba5797d608185108495503b75b1c0dcd11353f0d595f228" exitCode=0
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.041478 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" event={"ID":"0ec64748-14d0-4078-a51e-deee8610c82f","Type":"ContainerDied","Data":"f6acade73dba52709ba5797d608185108495503b75b1c0dcd11353f0d595f228"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.043249 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" event={"ID":"a182d0c0-45a9-450b-affc-44caf339abd8","Type":"ContainerStarted","Data":"7b07e2c779a6fbf4f475ce9cf1c343e6032cfd7a83cdca28a3cd301b468c1782"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.043400 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.044555 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" event={"ID":"f0133093-ff9a-4741-abf2-746361b98451","Type":"ContainerStarted","Data":"59757924ac9f529ffe8c40ac4650528a2184342b802256cd9245defb2eee5ef1"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.045876 4882 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mfzth container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.045938 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.045962 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" event={"ID":"d8995298-5850-4a91-b913-d9b134818b45","Type":"ContainerStarted","Data":"8191a455bc7e7faacd943380d0e7a7a28b5c93f5422ddc51ddc60582764f17b3"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.047374 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" event={"ID":"0a678431-0e28-410a-867d-d2dcb8dcfe36","Type":"ContainerStarted","Data":"8e3f492627ec01574cf9a49e9fbb025aa5871c9507c2fc099e6f2ae830def2cf"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.048651 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" event={"ID":"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f","Type":"ContainerStarted","Data":"a45588ca405ead2bc095135f0f6b0f3444228a97b15a986680662512149858e6"}
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.084867 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6" podStartSLOduration=124.084837455 podStartE2EDuration="2m4.084837455s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.081472704 +0000 UTC m=+146.830702231" watchObservedRunningTime="2025-10-02 16:19:48.084837455 +0000 UTC m=+146.834066982"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.128353 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7n6gf" podStartSLOduration=125.128334832 podStartE2EDuration="2m5.128334832s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.127477399 +0000 UTC m=+146.876706916" watchObservedRunningTime="2025-10-02 16:19:48.128334832 +0000 UTC m=+146.877564359"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.130339 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" podStartSLOduration=124.130333197 podStartE2EDuration="2m4.130333197s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.104909283 +0000 UTC m=+146.854138810" watchObservedRunningTime="2025-10-02 16:19:48.130333197 +0000 UTC m=+146.879562724"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.137713 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.137914 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.637883643 +0000 UTC m=+147.387113170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.138417 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.139492 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.639477686 +0000 UTC m=+147.388707433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.155078 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zjj8d" podStartSLOduration=7.155052912 podStartE2EDuration="7.155052912s" podCreationTimestamp="2025-10-02 16:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.154535927 +0000 UTC m=+146.903765474" watchObservedRunningTime="2025-10-02 16:19:48.155052912 +0000 UTC m=+146.904282439"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.188559 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj" podStartSLOduration=124.188513914 podStartE2EDuration="2m4.188513914s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.18320076 +0000 UTC m=+146.932430297" watchObservedRunningTime="2025-10-02 16:19:48.188513914 +0000 UTC m=+146.937743441"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.250987 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.254798 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.754772913 +0000 UTC m=+147.504002440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.259534 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t52zc" podStartSLOduration=125.259510362 podStartE2EDuration="2m5.259510362s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.249270793 +0000 UTC m=+146.998500320" watchObservedRunningTime="2025-10-02 16:19:48.259510362 +0000 UTC m=+147.008739889"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.262882 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.264270 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.764252182 +0000 UTC m=+147.513481709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.285132 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ztdvl" podStartSLOduration=125.28510374 podStartE2EDuration="2m5.28510374s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.284885374 +0000 UTC m=+147.034114911" watchObservedRunningTime="2025-10-02 16:19:48.28510374 +0000 UTC m=+147.034333267"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.325128 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6n7gp" podStartSLOduration=125.325108962 podStartE2EDuration="2m5.325108962s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.323881849 +0000 UTC m=+147.073111376" watchObservedRunningTime="2025-10-02 16:19:48.325108962 +0000 UTC m=+147.074338489"
Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.370137 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.370790 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.870770899 +0000 UTC m=+147.620000426 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.383967 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" podStartSLOduration=125.383939047 podStartE2EDuration="2m5.383939047s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:48.380410822 +0000 UTC m=+147.129640349" watchObservedRunningTime="2025-10-02 16:19:48.383939047 +0000 UTC m=+147.133168574" Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.473087 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.473592 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:48.973578454 +0000 UTC m=+147.722807981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.574070 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.574672 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.074652622 +0000 UTC m=+147.823882149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.574890 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.575241 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.075234238 +0000 UTC m=+147.824463765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.676647 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.677168 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.177145698 +0000 UTC m=+147.926375225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.781488 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.782542 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.282524075 +0000 UTC m=+148.031753602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.849132 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:48 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:48 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:48 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.849288 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.883549 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.884081 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.384058765 +0000 UTC m=+148.133288292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:48 crc kubenswrapper[4882]: I1002 16:19:48.985307 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:48 crc kubenswrapper[4882]: E1002 16:19:48.985777 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.48575621 +0000 UTC m=+148.234985737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.055847 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" event={"ID":"f855380f-e75e-4573-b8b1-b4d493210d4f","Type":"ContainerStarted","Data":"a12ba896c9c7058b44508f419a8bbb6e640f8842709a6a8cfa20ff1d2d8432c9"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.057253 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.057969 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" event={"ID":"13689e84-053e-4f97-8689-6a4c14800153","Type":"ContainerStarted","Data":"88649523910b3d929fe9f53646257a8f7bcd7752f999855e1b027b3f9a732508"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.059409 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" event={"ID":"09a298da-0301-49c1-b755-1b5ec0058b3e","Type":"ContainerStarted","Data":"bc3e71c335e04a0c5d40da1bf93ee402ff878dad8c8685ad523eb5efa3af4452"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.061937 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" event={"ID":"ff38de28-245a-4acd-b148-e2b71a457eff","Type":"ContainerStarted","Data":"01ce926d2d65f196554c50356ba8ff2e7f41b3a45255a7341cb62e7f56ebe456"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.063947 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wpc8" 
event={"ID":"e1f9c3ef-fd93-4666-a1b9-42c3ee8a87ae","Type":"ContainerStarted","Data":"8ca4497e8073977672e57a97dee791a1cd8efb8ca015eaa60e5caebae991cdd9"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.066169 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" event={"ID":"ed430b79-ad1a-456d-be2f-6cb51f2564dc","Type":"ContainerStarted","Data":"7ed2dba99aeb55ec5d04be77649e52f118aec1fb12bb3fb3a9ef021727545ba0"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.068241 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" event={"ID":"4106b164-7d6a-4837-840b-f5a068e1aec9","Type":"ContainerStarted","Data":"0b5d4e4cb0e41bb1d04743c7b02b52c77aab44127b499293a779b663f4ed5edb"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.070312 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" event={"ID":"0a678431-0e28-410a-867d-d2dcb8dcfe36","Type":"ContainerStarted","Data":"14bf77304d463d77d00cec4df6286c1025442c75518a660ad96e56396215d838"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.072307 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" event={"ID":"1ed1366e-852f-4e88-ae80-6fc761781c31","Type":"ContainerStarted","Data":"e5ea42a9bdd2f69f99de662a111ab4be48d475330682962db04e6b0723c76d34"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.072383 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" event={"ID":"1ed1366e-852f-4e88-ae80-6fc761781c31","Type":"ContainerStarted","Data":"abed7caff64b2781699d7c7ed7b65150d1693611720253f3762fc1e5d493f145"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.073907 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" event={"ID":"1c25a6f2-5b95-4d01-b7a1-b7fa2ba788d7","Type":"ContainerStarted","Data":"8eb522c43bfb3a3b318e499cac6a741f5ca38785049e8587be8bc1ec4c2a84f8"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.075475 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" event={"ID":"414ef5a5-d3ca-4879-a201-9b3f8f340740","Type":"ContainerStarted","Data":"9609d2ba14b5cfc8c73f3e5c8c9ede4f6b5642cac4aa68c1bc40002b24d15c23"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.075789 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.077574 4882 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4s5w container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.077630 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" podUID="414ef5a5-d3ca-4879-a201-9b3f8f340740" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.077888 4882 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" event={"ID":"3c101359-1d67-4282-bd53-04fb1d0a3977","Type":"ContainerStarted","Data":"129dc1fea676f493b091d8b257b8a12127b8bf7e1c5a256ea44d4dca8aa61e5a"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.080016 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" event={"ID":"e34155fb-8fe4-480a-aa87-518c3e344cab","Type":"ContainerStarted","Data":"3d0eabd139ba91dd741bd07e1f8deba00494b290d4c95275667875083d7b1b84"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.081489 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" event={"ID":"d41af342-d652-4f68-adb6-563afb9ca544","Type":"ContainerStarted","Data":"535a9ae77b75c2579445bbf9e4256ab68af2412c67d8c94426274e9d7b10d85a"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.084071 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" event={"ID":"d8995298-5850-4a91-b913-d9b134818b45","Type":"ContainerStarted","Data":"b74c6ae9690375d26d1957f5060f72053731b4bb4e678507c60c2d4ef82442e5"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.086255 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.086386 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.586350006 +0000 UTC m=+148.335579533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.086457 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" event={"ID":"0ec64748-14d0-4078-a51e-deee8610c82f","Type":"ContainerStarted","Data":"294ff52c70b0e09f44c1060ddf399ea63794e09abbbcb4a87bba15dc0c14911f"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.088109 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.089482 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.58946131 +0000 UTC m=+148.338690837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.090492 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" event={"ID":"53105ebf-8ac0-401a-8c49-b6c4780082e5","Type":"ContainerStarted","Data":"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.090818 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.098631 4882 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d972j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.099013 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.101601 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" 
event={"ID":"12d9299c-b3ee-40b9-a2d6-56159ba9ff66","Type":"ContainerStarted","Data":"83a418751300e6b10953bc2bb1b595fd1b52e2fd023a4fd523f7ac462db758d1"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.101722 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.103070 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" event={"ID":"18a563ac-8237-49c0-a830-23ad0150d1cb","Type":"ContainerStarted","Data":"e6f3e29296a178fb746093927f82abc0994ab0eeee620265311e79c4333c148e"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.105744 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" event={"ID":"5ec03001-e8a9-4e0f-a20a-ad45cb0542c6","Type":"ContainerStarted","Data":"7413e6d0f4197199a639b3312add3a04c07846f28196ad390e7fc96f349c6f0f"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.107851 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" event={"ID":"a0bfe57e-f325-4e0f-910b-0f402074eb76","Type":"ContainerStarted","Data":"6000976733fe29c547fbd71f2970c479da1d75fc5d531d6735f5b85e2975af63"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.112158 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" event={"ID":"dba2a584-8773-444f-a47f-d29bec61b7f1","Type":"ContainerStarted","Data":"a735dea82ba0702a3a0f13ad1079f1489e61cab3b43be51d4e479326afb502ee"} Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.189696 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.190806 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.690783545 +0000 UTC m=+148.440013072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.216388 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.221356 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xvg9j" podStartSLOduration=125.221322359 podStartE2EDuration="2m5.221322359s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.220597519 +0000 UTC m=+147.969827036" watchObservedRunningTime="2025-10-02 16:19:49.221322359 +0000 UTC m=+147.970551886" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.221704 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp" podStartSLOduration=125.221700419 podStartE2EDuration="2m5.221700419s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.153106027 +0000 UTC m=+147.902335554" watchObservedRunningTime="2025-10-02 16:19:49.221700419 +0000 UTC m=+147.970929946" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.320561 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.320985 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.820963247 +0000 UTC m=+148.570192774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.344660 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qhq56" podStartSLOduration=125.344635883 podStartE2EDuration="2m5.344635883s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.307079628 +0000 UTC m=+148.056309155" watchObservedRunningTime="2025-10-02 16:19:49.344635883 +0000 UTC m=+148.093865410" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.374780 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9t27z" podStartSLOduration=125.374758116 podStartE2EDuration="2m5.374758116s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.347718667 +0000 UTC m=+148.096948194" watchObservedRunningTime="2025-10-02 16:19:49.374758116 +0000 UTC m=+148.123987643" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.375525 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lgptr" podStartSLOduration=125.375520476 podStartE2EDuration="2m5.375520476s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.371712043 +0000 UTC m=+148.120941570" watchObservedRunningTime="2025-10-02 16:19:49.375520476 +0000 UTC m=+148.124750003" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.422550 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.422802 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.922736225 +0000 UTC m=+148.671965752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.423164 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.423663 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:49.92364523 +0000 UTC m=+148.672874947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.471832 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vczsf" podStartSLOduration=125.471808084 podStartE2EDuration="2m5.471808084s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.407586681 +0000 UTC m=+148.156816208" watchObservedRunningTime="2025-10-02 16:19:49.471808084 +0000 UTC m=+148.221037611" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.472053 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" podStartSLOduration=125.4720477 podStartE2EDuration="2m5.4720477s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.46800595 +0000 UTC m=+148.217235477" watchObservedRunningTime="2025-10-02 16:19:49.4720477 +0000 UTC m=+148.221277227" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.502715 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x8fdq" podStartSLOduration=125.502696697 podStartE2EDuration="2m5.502696697s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.500069655 +0000 UTC m=+148.249299182" watchObservedRunningTime="2025-10-02 16:19:49.502696697 +0000 UTC m=+148.251926224" Oct 02 16:19:49 crc 
kubenswrapper[4882]: I1002 16:19:49.525151 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.525425 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.025387756 +0000 UTC m=+148.774617293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.525615 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.526067 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.026048614 +0000 UTC m=+148.775278141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.541839 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6wpc8" podStartSLOduration=9.541812764 podStartE2EDuration="9.541812764s" podCreationTimestamp="2025-10-02 16:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.54091514 +0000 UTC m=+148.290144667" watchObservedRunningTime="2025-10-02 16:19:49.541812764 +0000 UTC m=+148.291042291" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.580981 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nwqqf" podStartSLOduration=126.580954783 podStartE2EDuration="2m6.580954783s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.579269177 +0000 UTC m=+148.328498704" watchObservedRunningTime="2025-10-02 16:19:49.580954783 +0000 UTC m=+148.330184310" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.620071 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbw9p" podStartSLOduration=125.620048459 podStartE2EDuration="2m5.620048459s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.616926034 +0000 UTC m=+148.366155561" watchObservedRunningTime="2025-10-02 16:19:49.620048459 +0000 UTC m=+148.369277976" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.626868 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.627090 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.1270442 +0000 UTC m=+148.876273727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.627225 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.627263 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.627350 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.627704 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.127687227 +0000 UTC m=+148.876916754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.634721 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.664956 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.684524 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.703166 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7cgt8" podStartSLOduration=125.703143347 podStartE2EDuration="2m5.703143347s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.640604881 +0000 UTC m=+148.389834408" watchObservedRunningTime="2025-10-02 16:19:49.703143347 +0000 UTC m=+148.452372874" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.728889 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.729196 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.729279 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.731139 4882 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.23111649 +0000 UTC m=+148.980346017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.741120 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.743445 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.766903 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6wpc8" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.785028 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" podStartSLOduration=126.784999711 podStartE2EDuration="2m6.784999711s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.705178893 +0000 UTC m=+148.454408420" watchObservedRunningTime="2025-10-02 16:19:49.784999711 +0000 UTC m=+148.534229238" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.785343 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nrq6m" podStartSLOduration=125.78533815 podStartE2EDuration="2m5.78533815s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.749734409 +0000 UTC m=+148.498963926" watchObservedRunningTime="2025-10-02 16:19:49.78533815 +0000 UTC m=+148.534567677" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.808713 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9snj8" podStartSLOduration=125.808679207 podStartE2EDuration="2m5.808679207s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.806166898 +0000 UTC m=+148.555396425" watchObservedRunningTime="2025-10-02 16:19:49.808679207 +0000 UTC m=+148.557908734" 
Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.831241 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.831634 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.331621363 +0000 UTC m=+149.080850880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.835541 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:49 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:49 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:49 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.835605 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.891927 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-db4xr" podStartSLOduration=125.891903998 podStartE2EDuration="2m5.891903998s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.858837705 +0000 UTC m=+148.608067242" watchObservedRunningTime="2025-10-02 16:19:49.891903998 +0000 UTC m=+148.641133525" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.894095 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmqdb" podStartSLOduration=125.894087538 podStartE2EDuration="2m5.894087538s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.890479309 +0000 UTC m=+148.639708836" watchObservedRunningTime="2025-10-02 16:19:49.894087538 +0000 UTC m=+148.643317065" Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.933755 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:49 crc kubenswrapper[4882]: E1002 16:19:49.934166 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.434149081 +0000 UTC m=+149.183378608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:49 crc kubenswrapper[4882]: I1002 16:19:49.975511 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.004524 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.029473 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" podStartSLOduration=126.029455371 podStartE2EDuration="2m6.029455371s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:49.953384676 +0000 UTC m=+148.702614283" watchObservedRunningTime="2025-10-02 16:19:50.029455371 +0000 UTC m=+148.778684898" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.034987 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.035403 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.535388034 +0000 UTC m=+149.284617561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.067321 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zc5l4" podStartSLOduration=126.067295934 podStartE2EDuration="2m6.067295934s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:50.028149865 +0000 UTC m=+148.777379392" watchObservedRunningTime="2025-10-02 16:19:50.067295934 +0000 UTC m=+148.816525461" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.070124 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z2v5c" podStartSLOduration=126.070114371 podStartE2EDuration="2m6.070114371s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:50.069364611 +0000 UTC m=+148.818594138" watchObservedRunningTime="2025-10-02 16:19:50.070114371 +0000 UTC m=+148.819343888" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.138160 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.138498 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.638477516 +0000 UTC m=+149.387707043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.158402 4882 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4s5w container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.158463 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w" podUID="414ef5a5-d3ca-4879-a201-9b3f8f340740" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.167366 4882 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-d972j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.167431 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.240660 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.247251 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.747226024 +0000 UTC m=+149.496455551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.253132 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqvp9" podStartSLOduration=126.253110175 podStartE2EDuration="2m6.253110175s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:50.239071342 +0000 UTC m=+148.988300869" watchObservedRunningTime="2025-10-02 16:19:50.253110175 +0000 UTC m=+149.002339702" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.255260 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv" podStartSLOduration=126.255252904 podStartE2EDuration="2m6.255252904s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:50.12468521 +0000 UTC m=+148.873914737" watchObservedRunningTime="2025-10-02 16:19:50.255252904 +0000 UTC m=+149.004482421" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.341142 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" podStartSLOduration=127.341113966 podStartE2EDuration="2m7.341113966s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:50.316051952 +0000 UTC m=+149.065281479" watchObservedRunningTime="2025-10-02 16:19:50.341113966 +0000 UTC m=+149.090343493" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.344659 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.345198 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.845175037 +0000 UTC m=+149.594404564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.457188 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.457592 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:50.957577664 +0000 UTC m=+149.706807191 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.558113 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.559001 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.058977561 +0000 UTC m=+149.808207088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.661472 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.661963 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.161943591 +0000 UTC m=+149.911173118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: W1002 16:19:50.747742 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-10b1d48f0a613443796a03ad06a96ec9a6944d768a9718c9dd73f987135df53e WatchSource:0}: Error finding container 10b1d48f0a613443796a03ad06a96ec9a6944d768a9718c9dd73f987135df53e: Status 404 returned error can't find the container with id 10b1d48f0a613443796a03ad06a96ec9a6944d768a9718c9dd73f987135df53e Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.763844 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.764341 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.264323185 +0000 UTC m=+150.013552712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.846500 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:50 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:50 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:50 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.846564 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.867002 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.867392 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.367378308 +0000 UTC m=+150.116607835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:50 crc kubenswrapper[4882]: W1002 16:19:50.902375 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-e2987c16c493869a0482b3c5216e9c8f8da5f7467143d48bd54b558b756d56be WatchSource:0}: Error finding container e2987c16c493869a0482b3c5216e9c8f8da5f7467143d48bd54b558b756d56be: Status 404 returned error can't find the container with id e2987c16c493869a0482b3c5216e9c8f8da5f7467143d48bd54b558b756d56be Oct 02 16:19:50 crc kubenswrapper[4882]: I1002 16:19:50.970010 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:50 crc kubenswrapper[4882]: E1002 16:19:50.970433 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.470410919 +0000 UTC m=+150.219640436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.071168 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.071621 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.571606721 +0000 UTC m=+150.320836248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.161413 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3b1ac088a0ae01170086662e6fdda741df07881ac9e6fadca050d43e2e8ed12d"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.161822 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9d592023c0ddae1588f5a3b8bd2ae4d42d80ce30d7683a328a1f82e4ca7566f5"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.164126 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" event={"ID":"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4","Type":"ContainerStarted","Data":"7f3f6a6403de7044a13b8ab4060c168637398cf875a38522192493a0c4f1594d"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.165689 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4bc1812114461a3778daee075ff1d27dbe895b2e46d41e9376ab9ab1a34b586f"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.165759 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"10b1d48f0a613443796a03ad06a96ec9a6944d768a9718c9dd73f987135df53e"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.165968 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.168321 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e2987c16c493869a0482b3c5216e9c8f8da5f7467143d48bd54b558b756d56be"} Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.172532 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.172735 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.672704729 +0000 UTC m=+150.421934256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.172799 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.173175 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.673156531 +0000 UTC m=+150.422386058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.273905 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.274111 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.774074005 +0000 UTC m=+150.523303532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.274385 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.274936 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.774913819 +0000 UTC m=+150.524143346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.375522 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.375881 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.875864613 +0000 UTC m=+150.625094130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.477063 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.477635 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:51.97761266 +0000 UTC m=+150.726842367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.578339 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.578520 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.078486703 +0000 UTC m=+150.827716230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.578632 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.579062 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.079054458 +0000 UTC m=+150.828283975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.679763 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.680236 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.180203528 +0000 UTC m=+150.929433055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.782156 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.782691 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.282668984 +0000 UTC m=+151.031898501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.835172 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:51 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:51 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:51 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.835289 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.855038 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.883529 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.883993 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.383974659 +0000 UTC m=+151.133204176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.889557 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.889963 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.910710 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.912850 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.912945 4882 patch_prober.go:28] interesting pod/console-f9d7485db-9f6pj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.912981 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9f6pj" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.970072 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x8p45"] Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.971409 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.975265 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 16:19:51 crc kubenswrapper[4882]: I1002 16:19:51.985498 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:51 crc kubenswrapper[4882]: E1002 16:19:51.987397 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.487378101 +0000 UTC m=+151.236607618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.086251 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8p45"] Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.086798 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.086941 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.586922077 +0000 UTC m=+151.336151604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.087092 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.087122 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgjgn\" (UniqueName: \"kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.087179 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.087239 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.087569 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.587558784 +0000 UTC m=+151.336788311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.131816 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dn4hr"] Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.133149 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.135857 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.178308 4882 generic.go:334] "Generic (PLEG): container finished" podID="e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" containerID="a45588ca405ead2bc095135f0f6b0f3444228a97b15a986680662512149858e6" exitCode=0 Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.178377 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" event={"ID":"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f","Type":"ContainerDied","Data":"a45588ca405ead2bc095135f0f6b0f3444228a97b15a986680662512149858e6"} Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.180524 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ba2d7f6b0be6e091a6c65368be688994cc66a736252be74ebb531493473e8942"} Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.187925 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.188272 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.688255652 +0000 UTC m=+151.437485179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.188344 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.188416 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.188490 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgjgn\" (UniqueName: \"kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.188521 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.189044 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.189095 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.189135 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzwz\" (UniqueName: \"kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.189455 4882 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.689446675 +0000 UTC m=+151.438676202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.189652 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.189748 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.251061 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgjgn\" (UniqueName: \"kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn\") pod \"certified-operators-x8p45\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") " pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.254269 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn4hr"] Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.288746 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8p45" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.289628 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.290055 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.79003235 +0000 UTC m=+151.539261877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.290098 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.290133 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.290183 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzwz\" (UniqueName: \"kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.290251 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.291306 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr" Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.291638 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.791626943 +0000 UTC m=+151.540856470 (durationBeforeRetry 500ms). 
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.292577 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.325186 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzwz\" (UniqueName: \"kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz\") pod \"community-operators-dn4hr\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") " pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.360731 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"]
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.362197 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.383838 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"]
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.392069 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.393812 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.893782981 +0000 UTC m=+151.643012508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.448993 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.496855 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.496903 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.496940 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76tp\" (UniqueName: \"kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.497000 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.497451 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:52.997434509 +0000 UTC m=+151.746664036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.544364 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kkjb6"]
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.545788 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.564659 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kkjb6"]
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.599826 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600005 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600074 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600104 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600123 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxsn\" (UniqueName: \"kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600142 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.600175 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76tp\" (UniqueName: \"kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.600425 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.10040842 +0000 UTC m=+151.849637947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.601204 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.601480 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.646518 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z76tp\" (UniqueName: \"kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp\") pod \"certified-operators-kvxtr\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") " pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.687365 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.701143 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.701207 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxsn\" (UniqueName: \"kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.701297 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.701341 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.701704 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.201688693 +0000 UTC m=+151.950918220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.701742 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.702020 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.737928 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxsn\" (UniqueName: \"kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn\") pod \"community-operators-kkjb6\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.802955 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.803408 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.303390428 +0000 UTC m=+152.052619955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.852501 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 16:19:52 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld
Oct 02 16:19:52 crc kubenswrapper[4882]: [+]process-running ok
Oct 02 16:19:52 crc kubenswrapper[4882]: healthz check failed
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.852921 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.871770 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkjb6"
Oct 02 16:19:52 crc kubenswrapper[4882]: I1002 16:19:52.908526 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:52 crc kubenswrapper[4882]: E1002 16:19:52.909008 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.40899156 +0000 UTC m=+152.158221087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.009831 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.010331 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.510311425 +0000 UTC m=+152.259540942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.114415 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.114825 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.614809817 +0000 UTC m=+152.364039344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.157988 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8p45"]
Oct 02 16:19:53 crc kubenswrapper[4882]: W1002 16:19:53.202020 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f73bba_eae3_4e14_8cc0_2d4fe743b0d3.slice/crio-c098735f392da036bbe8de813de8c781623839d5a66c53ca02115f601538e8d1 WatchSource:0}: Error finding container c098735f392da036bbe8de813de8c781623839d5a66c53ca02115f601538e8d1: Status 404 returned error can't find the container with id c098735f392da036bbe8de813de8c781623839d5a66c53ca02115f601538e8d1
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.215543 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.215941 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.715923796 +0000 UTC m=+152.465153323 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.277541 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.303820 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn4hr"]
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.324763 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.325137 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.825121766 +0000 UTC m=+152.574351293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.425998 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.426565 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:53.926542854 +0000 UTC m=+152.675772381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.448415 4882 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-xhcsv container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.448730 4882 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-xhcsv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.448742 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" podUID="12d9299c-b3ee-40b9-a2d6-56159ba9ff66" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.448798 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" podUID="12d9299c-b3ee-40b9-a2d6-56159ba9ff66" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.448538 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6n7gp"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.449157 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"]
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.451520 4882 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n7gp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.451590 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n7gp" podUID="41edc773-9f85-408f-9605-a86000b41aa2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.451653 4882 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n7gp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.451746 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n7gp" podUID="41edc773-9f85-408f-9605-a86000b41aa2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.457072 4882 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n7gp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.457144 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n7gp" podUID="41edc773-9f85-408f-9605-a86000b41aa2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.473007 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mw2kv"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.476688 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"
Oct 02 16:19:53 crc kubenswrapper[4882]: W1002 16:19:53.476994 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250ffd8d_e281_4666_838c_2d351c64a6a6.slice/crio-0c1ee458867c18848d8c3d1c33e14638b76717e9717a429f7e0146d3267fc233 WatchSource:0}: Error finding container 0c1ee458867c18848d8c3d1c33e14638b76717e9717a429f7e0146d3267fc233: Status 404 returned error can't find the container with id 0c1ee458867c18848d8c3d1c33e14638b76717e9717a429f7e0146d3267fc233
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.477228 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.542117 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.552371 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.579153 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.079131928 +0000 UTC m=+152.828361455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
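The prober lines above are plain HTTP GETs issued by the kubelet with the probe's own timeout: "connect: connection refused" means nothing was listening on 10.217.0.13:8080 yet, while the config-operator's "Client.Timeout exceeded while awaiting headers" means the connection was attempted but no response arrived in time. A sketch of the equivalent check (the URL and one-second timeout are illustrative; real values come from each container's probe spec), noting that statuses in [200,400) count as success:

```go
// Sketch of a kubelet-style HTTP probe behind these prober.go:107 lines.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout} // expiry surfaces as "Client.Timeout exceeded while awaiting headers"
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 10.217.0.13:8080: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://10.217.0.13:8080/", time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```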
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.619432 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.636345 4882 patch_prober.go:28] interesting pod/apiserver-76f77b778f-snzmw container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]log ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]etcd ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/max-in-flight-filter ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Oct 02 16:19:53 crc kubenswrapper[4882]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/project.openshift.io-projectcache ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/openshift.io-startinformers ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/openshift.io-restmapperupdater ok
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 02 16:19:53 crc kubenswrapper[4882]: livez check failed
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.636413 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" podUID="ff38de28-245a-4acd-b148-e2b71a457eff" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.647163 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.647698 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.147674067 +0000 UTC m=+152.896903594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.648159 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mbjzj"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.656687 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kkjb6"]
Oct 02 16:19:53 crc kubenswrapper[4882]: W1002 16:19:53.756474 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02587579_5fbd_4019_afdc_5fb1b713adbb.slice/crio-511c4d84abb71b6d026368a3596407aa016dad16cf1560d83246b14321e285e6 WatchSource:0}: Error finding container 511c4d84abb71b6d026368a3596407aa016dad16cf1560d83246b14321e285e6: Status 404 returned error can't find the container with id 511c4d84abb71b6d026368a3596407aa016dad16cf1560d83246b14321e285e6
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.775992 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.776599 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.276579596 +0000 UTC m=+153.025809123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.833356 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.842571 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 16:19:53 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld
Oct 02 16:19:53 crc kubenswrapper[4882]: [+]process-running ok
Oct 02 16:19:53 crc kubenswrapper[4882]: healthz check failed
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.842693 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.863681 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.876946 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.877517 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.37749439 +0000 UTC m=+153.126723917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.903263 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6xht6"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.978603 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:53 crc kubenswrapper[4882]: E1002 16:19:53.981091 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.481076686 +0000 UTC m=+153.230306213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.987491 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.992899 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"]
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.994591 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:53 crc kubenswrapper[4882]: I1002 16:19:53.994828 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42mqg"
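The [+]/[-] blocks the startup probes keep printing are aggregated healthz output: the server runs each named check, emits one line per check ("reason withheld" unless verbose output is requested), and returns 500 if any check failed, which is what the kubelet then records as "HTTP probe failed with statuscode: 500". A sketch of such an endpoint (the check names and trailing "healthz check failed" line mirror the log; the handler itself is illustrative, not the router's or apiserver's implementation):

```go
// Sketch of an aggregated healthz endpoint of the kind these startup probes hit.
package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	run  func() error
}

func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if err := c.run(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			body += "healthz check failed\n"
			w.WriteHeader(http.StatusInternalServerError) // kubelet logs "HTTP probe failed with statuscode: 500"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	http.Handle("/healthz", healthz([]check{
		{"backend-http", func() error { return fmt.Errorf("not ready") }},
		{"has-synced", func() error { return fmt.Errorf("not ready") }},
		{"process-running", func() error { return nil }},
	}))
	_ = http.ListenAndServe(":8080", nil)
}
```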
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.000730 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f73bba_eae3_4e14_8cc0_2d4fe743b0d3.slice/crio-conmon-12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9712373b_49f5_4e5e_9309_2031e7a680fa.slice/crio-d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d.scope\": RecentStats: unable to find data in memory cache]"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.005815 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.020475 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"]
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.084851 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume\") pod \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.084897 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfctr\" (UniqueName: \"kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr\") pod \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.084995 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.085048 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume\") pod \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\" (UID: \"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.085179 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.085284 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.085338 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bxs\" (UniqueName: \"kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.088421 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" (UID: "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.088645 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.58861475 +0000 UTC m=+153.337844447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.119003 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr" (OuterVolumeSpecName: "kube-api-access-kfctr") pod "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" (UID: "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f"). InnerVolumeSpecName "kube-api-access-kfctr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.119281 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" (UID: "e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.186504 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187054 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187080 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187119 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bxs\" (UniqueName: \"kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187178 4882 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187189 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfctr\" (UniqueName: \"kubernetes.io/projected/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-kube-api-access-kfctr\") on node \"crc\" DevicePath \"\""
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187198 4882 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187431 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.187582 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.687564771 +0000 UTC m=+153.436794298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.187825 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.196926 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4s5w"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.212319 4882 generic.go:334] "Generic (PLEG): container finished" podID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerID="12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e" exitCode=0
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.212389 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerDied","Data":"12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.212414 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerStarted","Data":"c098735f392da036bbe8de813de8c781623839d5a66c53ca02115f601538e8d1"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.213756 4882 generic.go:334] "Generic (PLEG): container finished" podID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerID="d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d" exitCode=0
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.213792 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerDied","Data":"d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.213805 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerStarted","Data":"d20592e6c92d42472cd773eca00f0d8ab8c5975e05f3e768048ebcdeccfc37af"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.213994 4882 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.220280 4882 generic.go:334] "Generic (PLEG): container finished" podID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerID="d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde" exitCode=0
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.220365 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerDied","Data":"d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.220393 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerStarted","Data":"0c1ee458867c18848d8c3d1c33e14638b76717e9717a429f7e0146d3267fc233"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.221965 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bxs\" (UniqueName: \"kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs\") pod \"redhat-marketplace-42mqg\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") " pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.225659 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerStarted","Data":"ba28f8c5a492470c9ae2eeffd9e239799ba10cec0a8d3461aebf34d319c884d7"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.225699 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerStarted","Data":"511c4d84abb71b6d026368a3596407aa016dad16cf1560d83246b14321e285e6"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.234819 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv" event={"ID":"e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f","Type":"ContainerDied","Data":"4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3"}
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.234878 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4afc7db724b9fbd63dd8cd60e68b8d9b9a7fbcd73b215beb42de5732674624e3"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.234973 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.260029 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jbrgv"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.288916 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.289328 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.789290207 +0000 UTC m=+153.538519734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.289429 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.290723 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.790683035 +0000 UTC m=+153.539912562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.331920 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.340887 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"]
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.341389 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" containerName="collect-profiles"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.341416 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" containerName="collect-profiles"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.341561 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" containerName="collect-profiles"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.342658 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.346183 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"]
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.395148 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.395598 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.895576807 +0000 UTC m=+153.644806334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.395820 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.396176 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:54.896168543 +0000 UTC m=+153.645398070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.422312 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.423864 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.443479 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.443926 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.444849 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.504119 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.504663 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.504715 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.504740 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.504845 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.004787388 +0000 UTC m=+153.754016915 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.504999 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.505397 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.505426 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.005418925 +0000 UTC m=+153.754648442 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.505532 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vksbp\" (UniqueName: \"kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.594916 4882 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.608409 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609197 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609318 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609379 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vksbp\" (UniqueName: \"kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609445 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609490 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.609696 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.610385 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.609856 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.109822394 +0000 UTC m=+153.859051921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.610690 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.639745 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vksbp\" (UniqueName: \"kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp\") pod \"redhat-marketplace-wll9x\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.652619 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.674539 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.704039 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"] Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.713407 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.713979 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.213959535 +0000 UTC m=+153.963189072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.773588 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.814736 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.815053 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.315014893 +0000 UTC m=+154.064244420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.815447 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.815801 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.315783774 +0000 UTC m=+154.065013301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fl2p9" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.843891 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:54 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:54 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:54 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.843972 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.917068 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:54 crc kubenswrapper[4882]: E1002 16:19:54.917484 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 16:19:55.417467459 +0000 UTC m=+154.166696986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 16:19:54 crc kubenswrapper[4882]: I1002 16:19:54.976453 4882 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T16:19:54.595339038Z","Handler":null,"Name":""} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.020479 4882 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.020529 4882 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.024355 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.026294 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.031602 4882 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
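
The run of failures above is a startup ordering race, not a persistent fault: every UnmountVolume.TearDown and MountVolume.MountDevice attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is rejected with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" and requeued with durationBeforeRetry 500ms, until the plugin watcher picks up /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock (16:19:54.594916), the driver is validated and registered (16:19:55.020), and the next reconciler pass mounts the volume. Below is a minimal, self-contained Go sketch of that retry-until-registered pattern; the names (registry, mountDevice) and timings are illustrative assumptions for this note, not kubelet's actual types, and the real kubelet grows the retry interval exponentially per operation rather than holding it at 500ms.

package main

import (
    "fmt"
    "sync"
    "time"
)

// registry stands in for the kubelet's list of registered CSI drivers
// (hypothetical type for this note, not kubelet's real plugin manager).
type registry struct {
    mu      sync.Mutex
    drivers map[string]bool
}

func (r *registry) register(name string) {
    r.mu.Lock()
    defer r.mu.Unlock()
    r.drivers[name] = true
}

// mountDevice fails until the named driver has been registered, echoing
// the "failed to create newCsiDriverClient" errors in the log above.
func (r *registry) mountDevice(name string) error {
    r.mu.Lock()
    defer r.mu.Unlock()
    if !r.drivers[name] {
        return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
    }
    return nil
}

func main() {
    reg := &registry{drivers: map[string]bool{}}

    // Registration arrives asynchronously, like the plugin_watcher /
    // RegisterPlugin / csi_plugin lines between 16:19:54 and 16:19:55.
    go func() {
        time.Sleep(1200 * time.Millisecond)
        reg.register("kubevirt.io.hostpath-provisioner")
    }()

    const retry = 500 * time.Millisecond // cf. durationBeforeRetry 500ms
    for {
        err := reg.mountDevice("kubevirt.io.hostpath-provisioner")
        if err == nil {
            fmt.Println("MountVolume.MountDevice succeeded")
            return
        }
        fmt.Printf("operation failed, no retries permitted until %s: %v\n",
            time.Now().Add(retry).Format(time.RFC3339), err)
        time.Sleep(retry)
    }
}

The design point visible in the log matches the sketch: the volume reconciler never blocks waiting for driver registration; it fails fast, records the earliest permitted retry time, and lets a later pass succeed once registration lands.
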
Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.031654 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.077891 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fl2p9\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.127764 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.135785 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.153981 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.190810 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.261814 4882 generic.go:334] "Generic (PLEG): container finished" podID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerID="3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021" exitCode=0 Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.262366 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerDied","Data":"3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.262444 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerStarted","Data":"a35fe7e1af2e0ed997ca4e1b7c4ee1b8dcc03a7c9f576c9664c62067eb83bc3c"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.266288 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"73e3dd00-0e93-4ac5-b12f-9ff3843a635c","Type":"ContainerStarted","Data":"cbd1a4f930158117f748b5b25b84150a8836be1913fd78c4c90ed7b9edfb6653"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.273627 4882 generic.go:334] "Generic (PLEG): container finished" podID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerID="ba28f8c5a492470c9ae2eeffd9e239799ba10cec0a8d3461aebf34d319c884d7" exitCode=0 Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.273756 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerDied","Data":"ba28f8c5a492470c9ae2eeffd9e239799ba10cec0a8d3461aebf34d319c884d7"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.283188 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" event={"ID":"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4","Type":"ContainerStarted","Data":"8f3d8b812482061fc20e298d83fdfd87c2b0d9dfbe3e039b64a6a504e918efea"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.283279 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" event={"ID":"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4","Type":"ContainerStarted","Data":"fef6122b23651460d111be5b1ab07575589c28cd58918ef1c33a09574bd3583f"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.309598 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerStarted","Data":"637e8414b13816af1fceaa3cfa91eeed3021da8620c637c838c315f77e16c15e"} Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.329797 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.337091 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.341845 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.351142 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.432064 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.432285 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.432321 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblmm\" (UniqueName: \"kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.460633 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xhcsv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.502449 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.542135 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.542741 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xblmm\" (UniqueName: \"kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.542782 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.544894 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " 
pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.545049 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.579372 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xblmm\" (UniqueName: \"kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm\") pod \"redhat-operators-bvdff\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") " pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.688509 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.724689 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.725815 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.738424 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"] Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.834271 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:55 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:55 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:55 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.834330 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.846626 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.846675 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nh86\" (UniqueName: \"kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.846776 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " 
pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.948448 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.948854 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.948886 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nh86\" (UniqueName: \"kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.950054 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.955640 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:55 crc kubenswrapper[4882]: I1002 16:19:55.981132 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nh86\" (UniqueName: \"kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86\") pod \"redhat-operators-qxzqv\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") " pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.050876 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzqv" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.198572 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"] Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.330824 4882 generic.go:334] "Generic (PLEG): container finished" podID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerID="591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873" exitCode=0 Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.330945 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerDied","Data":"591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.347916 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" event={"ID":"c207eb71-c634-4cc1-b3f7-720ddbb6dc56","Type":"ContainerStarted","Data":"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.347969 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" event={"ID":"c207eb71-c634-4cc1-b3f7-720ddbb6dc56","Type":"ContainerStarted","Data":"edab5d4e0b3f245e89df82a16d00e20a92211e45de26510286d3404e29557a20"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.348277 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.353085 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"73e3dd00-0e93-4ac5-b12f-9ff3843a635c","Type":"ContainerStarted","Data":"8ba89bdb64e1cd868884297adc3f54d34c7200f765a6bf0073ceca28ca2c8596"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.355889 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerStarted","Data":"e19e5bd77d7771abaa10368415e23fcd07bf2b8bb8d8d652ce5caf38922a4781"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.377302 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" event={"ID":"a011224b-5e5e-4cb3-bf68-c2c5db0ab8c4","Type":"ContainerStarted","Data":"d29c10f3589a3d5e631baa7fff0fc635da061bcff93d10c4528c0f0630e56e4f"} Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.387984 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" podStartSLOduration=132.387962328 podStartE2EDuration="2m12.387962328s" podCreationTimestamp="2025-10-02 16:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:56.384876833 +0000 UTC m=+155.134106360" watchObservedRunningTime="2025-10-02 16:19:56.387962328 +0000 UTC m=+155.137191855" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.409859 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.409834564 podStartE2EDuration="2.409834564s" podCreationTimestamp="2025-10-02 
16:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:56.408814257 +0000 UTC m=+155.158043784" watchObservedRunningTime="2025-10-02 16:19:56.409834564 +0000 UTC m=+155.159064091" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.441539 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x5zj4" podStartSLOduration=15.441508819 podStartE2EDuration="15.441508819s" podCreationTimestamp="2025-10-02 16:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:19:56.437609842 +0000 UTC m=+155.186839369" watchObservedRunningTime="2025-10-02 16:19:56.441508819 +0000 UTC m=+155.190738346" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.558385 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"] Oct 02 16:19:56 crc kubenswrapper[4882]: W1002 16:19:56.589858 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc24002_ad85_4877_af5f_314a91c3fb9d.slice/crio-01bb4d4929d3e81862dc308243618f29ce9c45bc295d8d6ae48320169c85cffc WatchSource:0}: Error finding container 01bb4d4929d3e81862dc308243618f29ce9c45bc295d8d6ae48320169c85cffc: Status 404 returned error can't find the container with id 01bb4d4929d3e81862dc308243618f29ce9c45bc295d8d6ae48320169c85cffc Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.780108 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.836008 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:56 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:56 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:56 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.836110 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.893645 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:56 crc kubenswrapper[4882]: I1002 16:19:56.912859 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-snzmw" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.394364 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.396692 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.400802 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.400934 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.415287 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.470306 4882 generic.go:334] "Generic (PLEG): container finished" podID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerID="80a6d286fbf987e1b3a458167293176d9664f74a06ff83c8f62ea76c9da2c832" exitCode=0 Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.470417 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerDied","Data":"80a6d286fbf987e1b3a458167293176d9664f74a06ff83c8f62ea76c9da2c832"} Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.470450 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerStarted","Data":"01bb4d4929d3e81862dc308243618f29ce9c45bc295d8d6ae48320169c85cffc"} Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.482116 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.482329 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.485370 4882 generic.go:334] "Generic (PLEG): container finished" podID="73e3dd00-0e93-4ac5-b12f-9ff3843a635c" containerID="8ba89bdb64e1cd868884297adc3f54d34c7200f765a6bf0073ceca28ca2c8596" exitCode=0 Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.485581 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"73e3dd00-0e93-4ac5-b12f-9ff3843a635c","Type":"ContainerDied","Data":"8ba89bdb64e1cd868884297adc3f54d34c7200f765a6bf0073ceca28ca2c8596"} Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.492064 4882 generic.go:334] "Generic (PLEG): container finished" podID="ffca683c-4413-46fe-a196-d44d174991bf" containerID="92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8" exitCode=0 Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.492250 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerDied","Data":"92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8"} Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.584494 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.585383 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.588580 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.627692 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.751537 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.835345 4882 patch_prober.go:28] interesting pod/router-default-5444994796-vf2qc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 16:19:57 crc kubenswrapper[4882]: [-]has-synced failed: reason withheld Oct 02 16:19:57 crc kubenswrapper[4882]: [+]process-running ok Oct 02 16:19:57 crc kubenswrapper[4882]: healthz check failed Oct 02 16:19:57 crc kubenswrapper[4882]: I1002 16:19:57.835595 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vf2qc" podUID="96b0dd64-7e97-4cac-bcd1-1cc312027a48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.151906 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 16:19:58 crc kubenswrapper[4882]: W1002 16:19:58.166155 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b6acf21_cfbb_4d19_9a42_5931eba44aaa.slice/crio-56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c WatchSource:0}: Error finding container 56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c: Status 404 returned error can't find the container with id 56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.504164 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b6acf21-cfbb-4d19-9a42-5931eba44aaa","Type":"ContainerStarted","Data":"56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c"} Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.791377 
4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6wpc8"
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.835136 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.839093 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vf2qc"
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.902530 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.915421 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir\") pod \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") "
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.915584 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "73e3dd00-0e93-4ac5-b12f-9ff3843a635c" (UID: "73e3dd00-0e93-4ac5-b12f-9ff3843a635c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.915776 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access\") pod \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\" (UID: \"73e3dd00-0e93-4ac5-b12f-9ff3843a635c\") "
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.919423 4882 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 02 16:19:58 crc kubenswrapper[4882]: I1002 16:19:58.926577 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "73e3dd00-0e93-4ac5-b12f-9ff3843a635c" (UID: "73e3dd00-0e93-4ac5-b12f-9ff3843a635c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:19:59 crc kubenswrapper[4882]: I1002 16:19:59.021896 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/73e3dd00-0e93-4ac5-b12f-9ff3843a635c-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 16:19:59 crc kubenswrapper[4882]: I1002 16:19:59.540287 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b6acf21-cfbb-4d19-9a42-5931eba44aaa","Type":"ContainerStarted","Data":"dce8f7b0f914b46add034c254a3b9a9c6bd0dcac1395ce38fc61cdf0bd425a05"}
Oct 02 16:19:59 crc kubenswrapper[4882]: I1002 16:19:59.559806 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 16:19:59 crc kubenswrapper[4882]: I1002 16:19:59.559800 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"73e3dd00-0e93-4ac5-b12f-9ff3843a635c","Type":"ContainerDied","Data":"cbd1a4f930158117f748b5b25b84150a8836be1913fd78c4c90ed7b9edfb6653"}
Oct 02 16:19:59 crc kubenswrapper[4882]: I1002 16:19:59.559897 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd1a4f930158117f748b5b25b84150a8836be1913fd78c4c90ed7b9edfb6653"
Oct 02 16:20:00 crc kubenswrapper[4882]: I1002 16:20:00.578941 4882 generic.go:334] "Generic (PLEG): container finished" podID="5b6acf21-cfbb-4d19-9a42-5931eba44aaa" containerID="dce8f7b0f914b46add034c254a3b9a9c6bd0dcac1395ce38fc61cdf0bd425a05" exitCode=0
Oct 02 16:20:00 crc kubenswrapper[4882]: I1002 16:20:00.579574 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b6acf21-cfbb-4d19-9a42-5931eba44aaa","Type":"ContainerDied","Data":"dce8f7b0f914b46add034c254a3b9a9c6bd0dcac1395ce38fc61cdf0bd425a05"}
Oct 02 16:20:01 crc kubenswrapper[4882]: I1002 16:20:01.912108 4882 patch_prober.go:28] interesting pod/console-f9d7485db-9f6pj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Oct 02 16:20:01 crc kubenswrapper[4882]: I1002 16:20:01.912574 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9f6pj" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Oct 02 16:20:03 crc kubenswrapper[4882]: I1002 16:20:03.447048 4882 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n7gp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 02 16:20:03 crc kubenswrapper[4882]: I1002 16:20:03.447672 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6n7gp" podUID="41edc773-9f85-408f-9605-a86000b41aa2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 02 16:20:03 crc kubenswrapper[4882]: I1002 16:20:03.447153 4882 patch_prober.go:28] interesting pod/downloads-7954f5f757-6n7gp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 02 16:20:03 crc kubenswrapper[4882]: I1002 16:20:03.447752 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6n7gp" podUID="41edc773-9f85-408f-9605-a86000b41aa2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 02 16:20:06 crc kubenswrapper[4882]: I1002 16:20:06.447580 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:20:06 crc kubenswrapper[4882]: I1002 16:20:06.457183 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f988cab-7579-4a12-8df6-e3e91e42f7df-metrics-certs\") pod \"network-metrics-daemon-6ldvk\" (UID: \"9f988cab-7579-4a12-8df6-e3e91e42f7df\") " pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:20:06 crc kubenswrapper[4882]: I1002 16:20:06.577279 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ldvk"
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.578280 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.678464 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b6acf21-cfbb-4d19-9a42-5931eba44aaa","Type":"ContainerDied","Data":"56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c"}
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.678513 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e02591cbec656614295073713e4bc8012f6db1a6fee62f18483eff5d081f1c"
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.678588 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.688309 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir\") pod \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") "
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.688390 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access\") pod \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\" (UID: \"5b6acf21-cfbb-4d19-9a42-5931eba44aaa\") "
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.688428 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b6acf21-cfbb-4d19-9a42-5931eba44aaa" (UID: "5b6acf21-cfbb-4d19-9a42-5931eba44aaa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.688819 4882 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.694760 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b6acf21-cfbb-4d19-9a42-5931eba44aaa" (UID: "5b6acf21-cfbb-4d19-9a42-5931eba44aaa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:20:08 crc kubenswrapper[4882]: I1002 16:20:08.790114 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b6acf21-cfbb-4d19-9a42-5931eba44aaa-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 16:20:09 crc kubenswrapper[4882]: I1002 16:20:09.390726 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 16:20:09 crc kubenswrapper[4882]: I1002 16:20:09.390828 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:20:11 crc kubenswrapper[4882]: I1002 16:20:11.916567 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9f6pj"
Oct 02 16:20:11 crc kubenswrapper[4882]: I1002 16:20:11.922014 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9f6pj"
Oct 02 16:20:13 crc kubenswrapper[4882]: I1002 16:20:13.456073 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6n7gp"
Oct 02 16:20:15 crc kubenswrapper[4882]: I1002 16:20:15.163545 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9"
Oct 02 16:20:24 crc kubenswrapper[4882]: I1002 16:20:24.254960 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwrcp"
Oct 02 16:20:30 crc kubenswrapper[4882]: I1002 16:20:30.132755 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 16:20:36 crc kubenswrapper[4882]: E1002 16:20:36.727262 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 02 16:20:36 crc kubenswrapper[4882]: E1002 16:20:36.727973 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nh86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qxzqv_openshift-marketplace(6bc24002-ad85-4877-af5f-314a91c3fb9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:36 crc kubenswrapper[4882]: E1002 16:20:36.729172 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qxzqv" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d"
Oct 02 16:20:38 crc kubenswrapper[4882]: E1002 16:20:38.984339 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qxzqv" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d"
Oct 02 16:20:39 crc kubenswrapper[4882]: I1002 16:20:39.390577 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 16:20:39 crc kubenswrapper[4882]: I1002 16:20:39.390715 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:20:40 crc kubenswrapper[4882]: E1002 16:20:40.913554 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 02 16:20:40 crc kubenswrapper[4882]: E1002 16:20:40.914298 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tgjgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x8p45_openshift-marketplace(f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:40 crc kubenswrapper[4882]: E1002 16:20:40.915552 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x8p45" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"
Oct 02 16:20:42 crc kubenswrapper[4882]: E1002 16:20:42.101372 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x8p45" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.862659 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.863531 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgzwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dn4hr_openshift-marketplace(9712373b-49f5-4e5e-9309-2031e7a680fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.864763 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dn4hr" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.958241 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.958542 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z76tp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kvxtr_openshift-marketplace(250ffd8d-e281-4666-838c-2d351c64a6a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:44 crc kubenswrapper[4882]: E1002 16:20:44.959828 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kvxtr" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.033071 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.033337 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xblmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bvdff_openshift-marketplace(ffca683c-4413-46fe-a196-d44d174991bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.034540 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bvdff" podUID="ffca683c-4413-46fe-a196-d44d174991bf"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.184820 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.185047 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrxsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kkjb6_openshift-marketplace(02587579-5fbd-4019-afdc-5fb1b713adbb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:45 crc kubenswrapper[4882]: E1002 16:20:45.187069 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kkjb6" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb"
Oct 02 16:20:47 crc kubenswrapper[4882]: E1002 16:20:47.833455 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kkjb6" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb"
Oct 02 16:20:47 crc kubenswrapper[4882]: E1002 16:20:47.833999 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bvdff" podUID="ffca683c-4413-46fe-a196-d44d174991bf"
Oct 02 16:20:47 crc kubenswrapper[4882]: E1002 16:20:47.834110 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kvxtr" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6"
Oct 02 16:20:47 crc kubenswrapper[4882]: E1002 16:20:47.834121 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dn4hr" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa"
Oct 02 16:20:48 crc kubenswrapper[4882]: I1002 16:20:48.224722 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6ldvk"]
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.729532 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.729730 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5bxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-42mqg_openshift-marketplace(bdf5160b-ad14-4521-9a46-ff205e5a2cd1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.730977 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-42mqg" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.747895 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.748129 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vksbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wll9x_openshift-marketplace(f3197c93-7ccd-41ae-a24c-1f2f8e097075): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.749285 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wll9x" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075"
Oct 02 16:20:48 crc kubenswrapper[4882]: I1002 16:20:48.931612 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" event={"ID":"9f988cab-7579-4a12-8df6-e3e91e42f7df","Type":"ContainerStarted","Data":"351f9e44bbc5ce24ba90b58e87b90dabb64a6e904bcdcef693856868aebf79c3"}
Oct 02 16:20:48 crc kubenswrapper[4882]: I1002 16:20:48.932161 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" event={"ID":"9f988cab-7579-4a12-8df6-e3e91e42f7df","Type":"ContainerStarted","Data":"ab4157d556e01f4ffcac06e042371d84ca8e03ed0bf1c8d2632120034d3512b0"}
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.939439 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wll9x" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075"
Oct 02 16:20:48 crc kubenswrapper[4882]: E1002 16:20:48.940074 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-42mqg" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1"
Oct 02 16:20:49 crc kubenswrapper[4882]: I1002 16:20:49.940818 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ldvk" event={"ID":"9f988cab-7579-4a12-8df6-e3e91e42f7df","Type":"ContainerStarted","Data":"d77c4b2a44fe626a9a39faefac3f85ff43e474a5ae5f65aeaa52c5f9fe3eba29"}
Oct 02 16:20:49 crc kubenswrapper[4882]: I1002 16:20:49.963408 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6ldvk" podStartSLOduration=186.963386217 podStartE2EDuration="3m6.963386217s" podCreationTimestamp="2025-10-02 16:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:20:49.962251467 +0000 UTC m=+208.711481014" watchObservedRunningTime="2025-10-02 16:20:49.963386217 +0000 UTC m=+208.712615744"
Oct 02 16:20:53 crc kubenswrapper[4882]: I1002 16:20:53.985388 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerStarted","Data":"700919ebc4e04458a176c8ca7fa2cc86f358ad8e8a13d02418f89ff8dd0b13f6"}
Oct 02 16:20:54 crc kubenswrapper[4882]: I1002 16:20:54.992283 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerStarted","Data":"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603"}
Oct 02 16:20:54 crc kubenswrapper[4882]: I1002 16:20:54.995293 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerDied","Data":"700919ebc4e04458a176c8ca7fa2cc86f358ad8e8a13d02418f89ff8dd0b13f6"}
Oct 02 16:20:54 crc kubenswrapper[4882]: I1002 16:20:54.995299 4882 generic.go:334] "Generic (PLEG): container finished" podID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerID="700919ebc4e04458a176c8ca7fa2cc86f358ad8e8a13d02418f89ff8dd0b13f6" exitCode=0
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.004204 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerStarted","Data":"128e9afbd35e0d572000db2ef0e6ca8f32a2a6c4ec78fd1203dce294d014ea3b"}
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.006139 4882 generic.go:334] "Generic (PLEG): container finished" podID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerID="a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603" exitCode=0
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.006173 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerDied","Data":"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603"}
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.027354 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxzqv" podStartSLOduration=3.044248917 podStartE2EDuration="1m1.027329799s" podCreationTimestamp="2025-10-02 16:19:55 +0000 UTC" firstStartedPulling="2025-10-02 16:19:57.475099344 +0000 UTC m=+156.224328871" lastFinishedPulling="2025-10-02 16:20:55.458180216 +0000 UTC m=+214.207409753" observedRunningTime="2025-10-02 16:20:56.024592878 +0000 UTC m=+214.773822415" watchObservedRunningTime="2025-10-02 16:20:56.027329799 +0000 UTC m=+214.776559336"
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.051908 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:20:56 crc kubenswrapper[4882]: I1002 16:20:56.051973 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:20:57 crc kubenswrapper[4882]: I1002 16:20:57.172553 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxzqv" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="registry-server" probeResult="failure" output=<
Oct 02 16:20:57 crc kubenswrapper[4882]: timeout: failed to connect service ":50051" within 1s
Oct 02 16:20:57 crc kubenswrapper[4882]: >
Oct 02 16:20:58 crc kubenswrapper[4882]: I1002 16:20:58.020007 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerStarted","Data":"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef"}
Oct 02 16:20:58 crc kubenswrapper[4882]: I1002 16:20:58.051435 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x8p45" podStartSLOduration=4.741914012 podStartE2EDuration="1m7.05141041s" podCreationTimestamp="2025-10-02 16:19:51 +0000 UTC" firstStartedPulling="2025-10-02 16:19:54.213653313 +0000 UTC m=+152.962882840" lastFinishedPulling="2025-10-02 16:20:56.523149701 +0000 UTC m=+215.272379238" observedRunningTime="2025-10-02 16:20:58.046780371 +0000 UTC m=+216.796009928" watchObservedRunningTime="2025-10-02 16:20:58.05141041 +0000 UTC m=+216.800639957"
Oct 02 16:21:00 crc kubenswrapper[4882]: I1002 16:21:00.034170 4882 generic.go:334] "Generic (PLEG): container finished" podID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerID="b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd" exitCode=0
Oct 02 16:21:00 crc kubenswrapper[4882]: I1002 16:21:00.034497 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerDied","Data":"b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd"}
Oct 02 16:21:01 crc kubenswrapper[4882]: I1002 16:21:01.046293 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerStarted","Data":"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"}
Oct 02 16:21:01 crc kubenswrapper[4882]: I1002 16:21:01.048783 4882 generic.go:334] "Generic (PLEG): container finished" podID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerID="5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df" exitCode=0
Oct 02 16:21:01 crc kubenswrapper[4882]: I1002 16:21:01.048869 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerDied","Data":"5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df"}
Oct 02 16:21:01 crc kubenswrapper[4882]: I1002 16:21:01.076274 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dn4hr" podStartSLOduration=2.771549047 podStartE2EDuration="1m9.07614908s" podCreationTimestamp="2025-10-02 16:19:52 +0000 UTC" firstStartedPulling="2025-10-02 16:19:54.215167134 +0000 UTC m=+152.964396661" lastFinishedPulling="2025-10-02 16:21:00.519767147 +0000 UTC m=+219.268996694" observedRunningTime="2025-10-02 16:21:01.063816543 +0000 UTC m=+219.813046060" watchObservedRunningTime="2025-10-02 16:21:01.07614908 +0000 UTC m=+219.825378627"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.056949 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerStarted","Data":"1cd297f506fffb28a6133d151194bb0d34db658254f783e7ddf3e9623603ec03"}
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.059726 4882 generic.go:334] "Generic (PLEG): container finished" podID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerID="3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774" exitCode=0
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.059810 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerDied","Data":"3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774"}
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.066426 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerStarted","Data":"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"}
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.107239 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kvxtr" podStartSLOduration=2.73816694 podStartE2EDuration="1m10.107195361s" podCreationTimestamp="2025-10-02 16:19:52 +0000 UTC" firstStartedPulling="2025-10-02 16:19:54.223494372 +0000 UTC m=+152.972723889" lastFinishedPulling="2025-10-02 16:21:01.592522783 +0000 UTC m=+220.341752310" observedRunningTime="2025-10-02 16:21:02.105184249 +0000 UTC m=+220.854413796" watchObservedRunningTime="2025-10-02 16:21:02.107195361 +0000 UTC m=+220.856424888"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.289480 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.289576 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.343920 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.449892 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.449946 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.688555 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:02 crc kubenswrapper[4882]: I1002 16:21:02.688629 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.073980 4882 generic.go:334] "Generic (PLEG): container finished" podID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerID="1cd297f506fffb28a6133d151194bb0d34db658254f783e7ddf3e9623603ec03" exitCode=0
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.074058 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerDied","Data":"1cd297f506fffb28a6133d151194bb0d34db658254f783e7ddf3e9623603ec03"}
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.078800 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerStarted","Data":"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43"}
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.145707 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.501392 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dn4hr" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="registry-server" probeResult="failure" output=<
Oct 02 16:21:03 crc kubenswrapper[4882]: timeout: failed to connect service ":50051" within 1s
Oct 02 16:21:03 crc kubenswrapper[4882]: >
Oct 02 16:21:03 crc kubenswrapper[4882]: I1002 16:21:03.737185 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kvxtr" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="registry-server" probeResult="failure" output=<
Oct 02 16:21:03 crc kubenswrapper[4882]: timeout: failed to connect service ":50051" within 1s
Oct 02 16:21:03 crc kubenswrapper[4882]: >
Oct 02 16:21:04 crc kubenswrapper[4882]: I1002 16:21:04.104530 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wll9x" podStartSLOduration=3.990317255 podStartE2EDuration="1m10.104508023s" podCreationTimestamp="2025-10-02 16:19:54 +0000 UTC" firstStartedPulling="2025-10-02 16:19:56.335759782 +0000 UTC m=+155.084989309" lastFinishedPulling="2025-10-02 16:21:02.44995054 +0000 UTC m=+221.199180077" observedRunningTime="2025-10-02 16:21:04.100686225 +0000 UTC m=+222.849915842" watchObservedRunningTime="2025-10-02 16:21:04.104508023 +0000 UTC m=+222.853737560"
Oct 02 16:21:04 crc kubenswrapper[4882]: I1002 16:21:04.675820 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:21:04 crc kubenswrapper[4882]: I1002 16:21:04.678753 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:21:04 crc kubenswrapper[4882]: I1002 16:21:04.736061 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:21:06 crc kubenswrapper[4882]: I1002 16:21:06.102980 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:21:06 crc kubenswrapper[4882]: I1002 16:21:06.172293 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.390878 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.390993 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.391076 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv"
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.392073 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.392253 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48" gracePeriod=600
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.694864 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"]
Oct 02 16:21:09 crc kubenswrapper[4882]: I1002 16:21:09.695809 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxzqv" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="registry-server" containerID="cri-o://128e9afbd35e0d572000db2ef0e6ca8f32a2a6c4ec78fd1203dce294d014ea3b" gracePeriod=2
Oct 02 16:21:10 crc kubenswrapper[4882]: I1002 16:21:10.129378 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48" exitCode=0
Oct 02 16:21:10 crc kubenswrapper[4882]: I1002 16:21:10.129434 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48"}
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.137671 4882 generic.go:334] "Generic (PLEG): container finished" podID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerID="128e9afbd35e0d572000db2ef0e6ca8f32a2a6c4ec78fd1203dce294d014ea3b" exitCode=0
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.137737 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerDied","Data":"128e9afbd35e0d572000db2ef0e6ca8f32a2a6c4ec78fd1203dce294d014ea3b"}
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.381188 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.412183 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content\") pod \"6bc24002-ad85-4877-af5f-314a91c3fb9d\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") "
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.412359 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities\") pod \"6bc24002-ad85-4877-af5f-314a91c3fb9d\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") "
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.412408 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nh86\" (UniqueName: \"kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86\") pod \"6bc24002-ad85-4877-af5f-314a91c3fb9d\" (UID: \"6bc24002-ad85-4877-af5f-314a91c3fb9d\") "
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.420466 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86" (OuterVolumeSpecName: "kube-api-access-4nh86") pod "6bc24002-ad85-4877-af5f-314a91c3fb9d" (UID: "6bc24002-ad85-4877-af5f-314a91c3fb9d"). InnerVolumeSpecName "kube-api-access-4nh86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.422035 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities" (OuterVolumeSpecName: "utilities") pod "6bc24002-ad85-4877-af5f-314a91c3fb9d" (UID: "6bc24002-ad85-4877-af5f-314a91c3fb9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.515525 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nh86\" (UniqueName: \"kubernetes.io/projected/6bc24002-ad85-4877-af5f-314a91c3fb9d-kube-api-access-4nh86\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.515575 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.691958 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bc24002-ad85-4877-af5f-314a91c3fb9d" (UID: "6bc24002-ad85-4877-af5f-314a91c3fb9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:21:11 crc kubenswrapper[4882]: I1002 16:21:11.717753 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc24002-ad85-4877-af5f-314a91c3fb9d-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.148790 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzqv" event={"ID":"6bc24002-ad85-4877-af5f-314a91c3fb9d","Type":"ContainerDied","Data":"01bb4d4929d3e81862dc308243618f29ce9c45bc295d8d6ae48320169c85cffc"}
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.148872 4882 scope.go:117] "RemoveContainer" containerID="128e9afbd35e0d572000db2ef0e6ca8f32a2a6c4ec78fd1203dce294d014ea3b"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.148934 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzqv"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.193791 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"]
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.198353 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxzqv"]
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.493300 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.535980 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.727754 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.800188 4882 scope.go:117] "RemoveContainer" containerID="700919ebc4e04458a176c8ca7fa2cc86f358ad8e8a13d02418f89ff8dd0b13f6"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.881016 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" path="/var/lib/kubelet/pods/6bc24002-ad85-4877-af5f-314a91c3fb9d/volumes"
Oct 02 16:21:12 crc kubenswrapper[4882]: I1002 16:21:12.881917 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:13 crc kubenswrapper[4882]: I1002 16:21:13.437538 4882 scope.go:117] "RemoveContainer" containerID="80a6d286fbf987e1b3a458167293176d9664f74a06ff83c8f62ea76c9da2c832"
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.176752 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerStarted","Data":"d41e9fb82262198d87482a4754acabc1c1ad095efbaba1c9cab0dbafb3618188"}
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.181525 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8"}
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.183791 4882 generic.go:334] "Generic (PLEG): container finished" podID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerID="93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c" exitCode=0
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.183835 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerDied","Data":"93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c"}
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.189946 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerStarted","Data":"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"}
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.237406 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kkjb6" podStartSLOduration=4.803779173 podStartE2EDuration="1m22.237385805s" podCreationTimestamp="2025-10-02 16:19:52 +0000 UTC" firstStartedPulling="2025-10-02 16:19:54.230253066 +0000 UTC m=+152.979482583" lastFinishedPulling="2025-10-02 16:21:11.663859688 +0000 UTC m=+230.413089215" observedRunningTime="2025-10-02 16:21:14.236532222 +0000 UTC m=+232.985761749" watchObservedRunningTime="2025-10-02 16:21:14.237385805 +0000 UTC m=+232.986615342"
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.699846 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"]
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.700971 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kvxtr" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="registry-server" containerID="cri-o://5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd" gracePeriod=2
Oct 02 16:21:14 crc kubenswrapper[4882]: I1002 16:21:14.750479 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wll9x"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.163470 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.198103 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerStarted","Data":"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471"}
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.203728 4882 generic.go:334] "Generic (PLEG): container finished" podID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerID="5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd" exitCode=0
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.204011 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kvxtr"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.204033 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerDied","Data":"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"}
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.204859 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kvxtr" event={"ID":"250ffd8d-e281-4666-838c-2d351c64a6a6","Type":"ContainerDied","Data":"0c1ee458867c18848d8c3d1c33e14638b76717e9717a429f7e0146d3267fc233"}
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.204947 4882 scope.go:117] "RemoveContainer" containerID="5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.211589 4882 generic.go:334] "Generic (PLEG): container finished" podID="ffca683c-4413-46fe-a196-d44d174991bf" containerID="ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98" exitCode=0
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.212631 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerDied","Data":"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"}
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.238465 4882 scope.go:117] "RemoveContainer" containerID="5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.252460 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-42mqg" podStartSLOduration=2.718445885 podStartE2EDuration="1m22.252434813s" podCreationTimestamp="2025-10-02 16:19:53 +0000 UTC" firstStartedPulling="2025-10-02 16:19:55.265534917 +0000 UTC m=+154.014764444" lastFinishedPulling="2025-10-02 16:21:14.799523845 +0000 UTC m=+233.548753372" observedRunningTime="2025-10-02 16:21:15.224202335 +0000 UTC m=+233.973431862" watchObservedRunningTime="2025-10-02 16:21:15.252434813 +0000 UTC m=+234.001664340"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.267128 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities\") pod \"250ffd8d-e281-4666-838c-2d351c64a6a6\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") "
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.267194 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content\") pod \"250ffd8d-e281-4666-838c-2d351c64a6a6\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") "
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.267386 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z76tp\" (UniqueName: \"kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp\") pod \"250ffd8d-e281-4666-838c-2d351c64a6a6\" (UID: \"250ffd8d-e281-4666-838c-2d351c64a6a6\") "
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.269310 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities" (OuterVolumeSpecName: "utilities") pod "250ffd8d-e281-4666-838c-2d351c64a6a6" (UID: "250ffd8d-e281-4666-838c-2d351c64a6a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.275556 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp" (OuterVolumeSpecName: "kube-api-access-z76tp") pod "250ffd8d-e281-4666-838c-2d351c64a6a6" (UID: "250ffd8d-e281-4666-838c-2d351c64a6a6"). InnerVolumeSpecName "kube-api-access-z76tp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.287824 4882 scope.go:117] "RemoveContainer" containerID="d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.314261 4882 scope.go:117] "RemoveContainer" containerID="5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"
Oct 02 16:21:15 crc kubenswrapper[4882]: E1002 16:21:15.315418 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd\": container with ID starting with 5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd not found: ID does not exist" containerID="5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.315467 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd"} err="failed to get container status \"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd\": rpc error: code = NotFound desc = could not find container \"5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd\": container with ID starting with 5914ca82e23a5e5031850f314da58c4e3b7f16273e7928c5008785a4051218bd not found: ID does not exist"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.315509 4882 scope.go:117] "RemoveContainer" containerID="5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df"
Oct 02 16:21:15 crc kubenswrapper[4882]: E1002 16:21:15.315991 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df\": container with ID starting with 5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df not found: ID does not exist" containerID="5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.316024 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df"} err="failed to get container status \"5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df\": rpc error: code = NotFound desc = could not find container \"5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df\": container with ID starting with 5fc342bf5f02183dd111959a68730c9a32f76c629b71cebd02fb68452057f8df not found: ID does not exist"
Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.316045 4882 scope.go:117] "RemoveContainer"
containerID="d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde" Oct 02 16:21:15 crc kubenswrapper[4882]: E1002 16:21:15.316413 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde\": container with ID starting with d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde not found: ID does not exist" containerID="d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.316444 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde"} err="failed to get container status \"d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde\": rpc error: code = NotFound desc = could not find container \"d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde\": container with ID starting with d2a7014a0b3b82ac248ff88bad79d24c30eaaa77f2838a42b46f36843d1dbdde not found: ID does not exist" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.318089 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "250ffd8d-e281-4666-838c-2d351c64a6a6" (UID: "250ffd8d-e281-4666-838c-2d351c64a6a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.369607 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z76tp\" (UniqueName: \"kubernetes.io/projected/250ffd8d-e281-4666-838c-2d351c64a6a6-kube-api-access-z76tp\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.369666 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.369681 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250ffd8d-e281-4666-838c-2d351c64a6a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.537673 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"] Oct 02 16:21:15 crc kubenswrapper[4882]: I1002 16:21:15.542592 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kvxtr"] Oct 02 16:21:16 crc kubenswrapper[4882]: I1002 16:21:16.221579 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerStarted","Data":"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"} Oct 02 16:21:16 crc kubenswrapper[4882]: I1002 16:21:16.238947 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvdff" podStartSLOduration=3.108288115 podStartE2EDuration="1m21.238924235s" podCreationTimestamp="2025-10-02 16:19:55 +0000 UTC" firstStartedPulling="2025-10-02 16:19:57.501826153 +0000 UTC m=+156.251055680" lastFinishedPulling="2025-10-02 16:21:15.632462273 +0000 UTC m=+234.381691800" 
observedRunningTime="2025-10-02 16:21:16.236140884 +0000 UTC m=+234.985370411" watchObservedRunningTime="2025-10-02 16:21:16.238924235 +0000 UTC m=+234.988153762" Oct 02 16:21:16 crc kubenswrapper[4882]: I1002 16:21:16.782437 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" path="/var/lib/kubelet/pods/250ffd8d-e281-4666-838c-2d351c64a6a6/volumes" Oct 02 16:21:17 crc kubenswrapper[4882]: I1002 16:21:17.691542 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"] Oct 02 16:21:17 crc kubenswrapper[4882]: I1002 16:21:17.691860 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wll9x" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="registry-server" containerID="cri-o://a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43" gracePeriod=2 Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.116555 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.209274 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vksbp\" (UniqueName: \"kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp\") pod \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.209336 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities\") pod \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.209447 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content\") pod \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\" (UID: \"f3197c93-7ccd-41ae-a24c-1f2f8e097075\") " Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.211387 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities" (OuterVolumeSpecName: "utilities") pod "f3197c93-7ccd-41ae-a24c-1f2f8e097075" (UID: "f3197c93-7ccd-41ae-a24c-1f2f8e097075"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.218624 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp" (OuterVolumeSpecName: "kube-api-access-vksbp") pod "f3197c93-7ccd-41ae-a24c-1f2f8e097075" (UID: "f3197c93-7ccd-41ae-a24c-1f2f8e097075"). InnerVolumeSpecName "kube-api-access-vksbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.224508 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3197c93-7ccd-41ae-a24c-1f2f8e097075" (UID: "f3197c93-7ccd-41ae-a24c-1f2f8e097075"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.245854 4882 generic.go:334] "Generic (PLEG): container finished" podID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerID="a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43" exitCode=0 Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.245919 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerDied","Data":"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43"} Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.245959 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wll9x" event={"ID":"f3197c93-7ccd-41ae-a24c-1f2f8e097075","Type":"ContainerDied","Data":"637e8414b13816af1fceaa3cfa91eeed3021da8620c637c838c315f77e16c15e"} Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.245982 4882 scope.go:117] "RemoveContainer" containerID="a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.246137 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wll9x" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.274536 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"] Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.279787 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wll9x"] Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.284428 4882 scope.go:117] "RemoveContainer" containerID="3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.309418 4882 scope.go:117] "RemoveContainer" containerID="591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.311431 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.311474 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vksbp\" (UniqueName: \"kubernetes.io/projected/f3197c93-7ccd-41ae-a24c-1f2f8e097075-kube-api-access-vksbp\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.311487 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3197c93-7ccd-41ae-a24c-1f2f8e097075-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.325687 4882 scope.go:117] "RemoveContainer" containerID="a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43" Oct 02 16:21:18 crc kubenswrapper[4882]: E1002 16:21:18.329621 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43\": container with ID starting with a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43 not found: ID does not exist" containerID="a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.329664 4882 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43"} err="failed to get container status \"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43\": rpc error: code = NotFound desc = could not find container \"a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43\": container with ID starting with a37f653da385998d1addf625729adfeb6394e8793d6415e066aab714faf6aa43 not found: ID does not exist" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.329697 4882 scope.go:117] "RemoveContainer" containerID="3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774" Oct 02 16:21:18 crc kubenswrapper[4882]: E1002 16:21:18.330130 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774\": container with ID starting with 3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774 not found: ID does not exist" containerID="3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.330152 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774"} err="failed to get container status \"3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774\": rpc error: code = NotFound desc = could not find container \"3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774\": container with ID starting with 3e240394bee403b2f22dfbb0d7ceab8cd0c9b89fa92c2adeb7510de76b574774 not found: ID does not exist" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.330167 4882 scope.go:117] "RemoveContainer" containerID="591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873" Oct 02 16:21:18 crc kubenswrapper[4882]: E1002 16:21:18.330620 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873\": container with ID starting with 591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873 not found: ID does not exist" containerID="591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.330636 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873"} err="failed to get container status \"591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873\": rpc error: code = NotFound desc = could not find container \"591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873\": container with ID starting with 591c731eaebb0705b94a9e63bab4d62311d049d134c83776d23ffb0ff345a873 not found: ID does not exist" Oct 02 16:21:18 crc kubenswrapper[4882]: I1002 16:21:18.767958 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" path="/var/lib/kubelet/pods/f3197c93-7ccd-41ae-a24c-1f2f8e097075/volumes" Oct 02 16:21:22 crc kubenswrapper[4882]: I1002 16:21:22.444894 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"] Oct 02 16:21:22 crc kubenswrapper[4882]: I1002 16:21:22.873932 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:22 crc kubenswrapper[4882]: I1002 16:21:22.874016 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:22 crc kubenswrapper[4882]: I1002 16:21:22.917318 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:23 crc kubenswrapper[4882]: I1002 16:21:23.325873 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:24 crc kubenswrapper[4882]: I1002 16:21:24.332239 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-42mqg" Oct 02 16:21:24 crc kubenswrapper[4882]: I1002 16:21:24.332295 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-42mqg" Oct 02 16:21:24 crc kubenswrapper[4882]: I1002 16:21:24.374028 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-42mqg" Oct 02 16:21:24 crc kubenswrapper[4882]: I1002 16:21:24.490782 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kkjb6"] Oct 02 16:21:25 crc kubenswrapper[4882]: I1002 16:21:25.290591 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kkjb6" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="registry-server" containerID="cri-o://d41e9fb82262198d87482a4754acabc1c1ad095efbaba1c9cab0dbafb3618188" gracePeriod=2 Oct 02 16:21:25 crc kubenswrapper[4882]: I1002 16:21:25.365961 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-42mqg" Oct 02 16:21:25 crc kubenswrapper[4882]: I1002 16:21:25.688998 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:21:25 crc kubenswrapper[4882]: I1002 16:21:25.689559 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:21:25 crc kubenswrapper[4882]: I1002 16:21:25.729862 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.296836 4882 generic.go:334] "Generic (PLEG): container finished" podID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerID="d41e9fb82262198d87482a4754acabc1c1ad095efbaba1c9cab0dbafb3618188" exitCode=0 Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.296888 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerDied","Data":"d41e9fb82262198d87482a4754acabc1c1ad095efbaba1c9cab0dbafb3618188"} Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.349803 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvdff" Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.877003 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.923636 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxsn\" (UniqueName: \"kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn\") pod \"02587579-5fbd-4019-afdc-5fb1b713adbb\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.923707 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content\") pod \"02587579-5fbd-4019-afdc-5fb1b713adbb\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.923822 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities\") pod \"02587579-5fbd-4019-afdc-5fb1b713adbb\" (UID: \"02587579-5fbd-4019-afdc-5fb1b713adbb\") " Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.926561 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities" (OuterVolumeSpecName: "utilities") pod "02587579-5fbd-4019-afdc-5fb1b713adbb" (UID: "02587579-5fbd-4019-afdc-5fb1b713adbb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.934002 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn" (OuterVolumeSpecName: "kube-api-access-lrxsn") pod "02587579-5fbd-4019-afdc-5fb1b713adbb" (UID: "02587579-5fbd-4019-afdc-5fb1b713adbb"). InnerVolumeSpecName "kube-api-access-lrxsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:21:26 crc kubenswrapper[4882]: I1002 16:21:26.978944 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02587579-5fbd-4019-afdc-5fb1b713adbb" (UID: "02587579-5fbd-4019-afdc-5fb1b713adbb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.025126 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.025188 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02587579-5fbd-4019-afdc-5fb1b713adbb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.025202 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxsn\" (UniqueName: \"kubernetes.io/projected/02587579-5fbd-4019-afdc-5fb1b713adbb-kube-api-access-lrxsn\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.304418 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkjb6" event={"ID":"02587579-5fbd-4019-afdc-5fb1b713adbb","Type":"ContainerDied","Data":"511c4d84abb71b6d026368a3596407aa016dad16cf1560d83246b14321e285e6"} Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.304516 4882 scope.go:117] "RemoveContainer" containerID="d41e9fb82262198d87482a4754acabc1c1ad095efbaba1c9cab0dbafb3618188" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.304462 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkjb6" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.318896 4882 scope.go:117] "RemoveContainer" containerID="1cd297f506fffb28a6133d151194bb0d34db658254f783e7ddf3e9623603ec03" Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.334552 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kkjb6"] Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.334619 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kkjb6"] Oct 02 16:21:27 crc kubenswrapper[4882]: I1002 16:21:27.359462 4882 scope.go:117] "RemoveContainer" containerID="ba28f8c5a492470c9ae2eeffd9e239799ba10cec0a8d3461aebf34d319c884d7" Oct 02 16:21:28 crc kubenswrapper[4882]: I1002 16:21:28.767727 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" path="/var/lib/kubelet/pods/02587579-5fbd-4019-afdc-5fb1b713adbb/volumes" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.483888 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" containerID="cri-o://94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836" gracePeriod=15 Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.854150 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885176 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"] Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885460 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885476 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885487 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885493 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885500 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885506 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885516 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885522 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885529 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885534 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885542 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885560 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885565 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885571 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885581 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885589 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885598 4882 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5b6acf21-cfbb-4d19-9a42-5931eba44aaa" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885603 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b6acf21-cfbb-4d19-9a42-5931eba44aaa" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885621 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885627 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="extract-content" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885635 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e3dd00-0e93-4ac5-b12f-9ff3843a635c" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885641 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e3dd00-0e93-4ac5-b12f-9ff3843a635c" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885652 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885658 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885666 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885672 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885680 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885685 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" Oct 02 16:21:47 crc kubenswrapper[4882]: E1002 16:21:47.885697 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885702 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="extract-utilities" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885789 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="02587579-5fbd-4019-afdc-5fb1b713adbb" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885800 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6acf21-cfbb-4d19-9a42-5931eba44aaa" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885811 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerName="oauth-openshift" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885818 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="250ffd8d-e281-4666-838c-2d351c64a6a6" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885825 4882 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f3197c93-7ccd-41ae-a24c-1f2f8e097075" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885834 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc24002-ad85-4877-af5f-314a91c3fb9d" containerName="registry-server" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.885843 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e3dd00-0e93-4ac5-b12f-9ff3843a635c" containerName="pruner" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.886284 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:47 crc kubenswrapper[4882]: I1002 16:21:47.911326 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"] Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015289 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015354 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015410 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015438 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015486 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015531 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015559 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 
02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015592 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015624 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015652 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhp6t\" (UniqueName: \"kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015695 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015724 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015751 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015783 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template\") pod \"92e16305-ea70-49fd-b269-0e36792ee6ea\" (UID: \"92e16305-ea70-49fd-b269-0e36792ee6ea\") " Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015832 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015875 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015919 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015945 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.015972 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016006 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016040 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016063 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5287c\" (UniqueName: \"kubernetes.io/projected/581e76bb-c0c0-40eb-990f-e3a94fedec65-kube-api-access-5287c\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016082 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-error\") pod 
\"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016104 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016126 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016149 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016184 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016249 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016273 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.016316 4882 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.018349 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.019832 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.020194 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.020383 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.024063 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.024630 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.025149 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.025231 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.025631 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.025882 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.027400 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.028702 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t" (OuterVolumeSpecName: "kube-api-access-hhp6t") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "kube-api-access-hhp6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.036789 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "92e16305-ea70-49fd-b269-0e36792ee6ea" (UID: "92e16305-ea70-49fd-b269-0e36792ee6ea"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.116821 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.116888 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5287c\" (UniqueName: \"kubernetes.io/projected/581e76bb-c0c0-40eb-990f-e3a94fedec65-kube-api-access-5287c\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.116914 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-error\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.116940 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.116970 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117002 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117042 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117070 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117789 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117818 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117828 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-dir\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117841 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117864 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117888 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117939 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117781 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-audit-policies\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117982 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.117995 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118008 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118019 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118030 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118039 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhp6t\" (UniqueName: \"kubernetes.io/projected/92e16305-ea70-49fd-b269-0e36792ee6ea-kube-api-access-hhp6t\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118049 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118058 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118068 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118079 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118090 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118087 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-service-ca\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118100 4882 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/92e16305-ea70-49fd-b269-0e36792ee6ea-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118173 4882 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/92e16305-ea70-49fd-b269-0e36792ee6ea-audit-policies\") on node \"crc\" DevicePath \"\""
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118344 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.118540 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.120818 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.121012 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-error\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.123481 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-login\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.123800 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.123972 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.124071 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-session\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.124726 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-system-router-certs\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.126489 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/581e76bb-c0c0-40eb-990f-e3a94fedec65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.132976 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5287c\" (UniqueName: \"kubernetes.io/projected/581e76bb-c0c0-40eb-990f-e3a94fedec65-kube-api-access-5287c\") pod \"oauth-openshift-69bcbbd7f8-nrxpg\" (UID: \"581e76bb-c0c0-40eb-990f-e3a94fedec65\") " pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.207609 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.429923 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"]
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.435708 4882 generic.go:334] "Generic (PLEG): container finished" podID="92e16305-ea70-49fd-b269-0e36792ee6ea" containerID="94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836" exitCode=0
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.435766 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.435776 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" event={"ID":"92e16305-ea70-49fd-b269-0e36792ee6ea","Type":"ContainerDied","Data":"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"}
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.435819 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wcjdw" event={"ID":"92e16305-ea70-49fd-b269-0e36792ee6ea","Type":"ContainerDied","Data":"4d20d602d3a38e3a4a8ae883375f2437c8415b6593fa26116362b5db3536438c"}
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.435842 4882 scope.go:117] "RemoveContainer" containerID="94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.467963 4882 scope.go:117] "RemoveContainer" containerID="94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.470141 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"]
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.473782 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wcjdw"]
Oct 02 16:21:48 crc kubenswrapper[4882]: E1002 16:21:48.474049 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836\": container with ID starting with 94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836 not found: ID does not exist" containerID="94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.474112 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836"} err="failed to get container status \"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836\": rpc error: code = NotFound desc = could not find container \"94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836\": container with ID starting with 94fc5a8dd68e36c9845d98f5c696f2d1d78ac13a5f4e1ece499f466ec7607836 not found: ID does not exist"
Oct 02 16:21:48 crc kubenswrapper[4882]: I1002 16:21:48.767626 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e16305-ea70-49fd-b269-0e36792ee6ea" path="/var/lib/kubelet/pods/92e16305-ea70-49fd-b269-0e36792ee6ea/volumes"
Oct 02 16:21:49 crc kubenswrapper[4882]: I1002 16:21:49.442974 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" event={"ID":"581e76bb-c0c0-40eb-990f-e3a94fedec65","Type":"ContainerStarted","Data":"e63e353a34d1a9b0cf3c1bc01b416cb52c069e98c748c98abdcafa2cddbcdf46"}
Oct 02 16:21:49 crc kubenswrapper[4882]: I1002 16:21:49.443032 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" event={"ID":"581e76bb-c0c0-40eb-990f-e3a94fedec65","Type":"ContainerStarted","Data":"83065d704faae77a88f2a7883b41a7562918ec480a03565a0fe8cef6f7930b91"}
Oct 02 16:21:49 crc kubenswrapper[4882]: I1002 16:21:49.443907 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:49 crc kubenswrapper[4882]: I1002 16:21:49.450069 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg"
Oct 02 16:21:49 crc kubenswrapper[4882]: I1002 16:21:49.471907 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69bcbbd7f8-nrxpg" podStartSLOduration=27.471882107 podStartE2EDuration="27.471882107s" podCreationTimestamp="2025-10-02 16:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:21:49.465894589 +0000 UTC m=+268.215124126" watchObservedRunningTime="2025-10-02 16:21:49.471882107 +0000 UTC m=+268.221111634"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.450269 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8p45"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.451446 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x8p45" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="registry-server" containerID="cri-o://59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef" gracePeriod=30
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.478847 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dn4hr"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.479236 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dn4hr" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="registry-server" containerID="cri-o://baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2" gracePeriod=30
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.486427 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.486683 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" containerID="cri-o://2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9" gracePeriod=30
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.500685 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.501336 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-42mqg" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="registry-server" containerID="cri-o://37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471" gracePeriod=30
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.507280 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.507640 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvdff" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="registry-server" containerID="cri-o://f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3" gracePeriod=30
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.528655 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xfg74"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.529938 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.539865 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xfg74"]
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.581466 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.581553 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.581578 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4frv\" (UniqueName: \"kubernetes.io/projected/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-kube-api-access-g4frv\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.684626 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.684676 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4frv\" (UniqueName: \"kubernetes.io/projected/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-kube-api-access-g4frv\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.684752 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.687859 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: E1002 16:22:05.693438 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 is running failed: container process not found" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 16:22:05 crc kubenswrapper[4882]: E1002 16:22:05.694413 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 is running failed: container process not found" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 16:22:05 crc kubenswrapper[4882]: E1002 16:22:05.694709 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 is running failed: container process not found" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 16:22:05 crc kubenswrapper[4882]: E1002 16:22:05.694747 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-bvdff" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="registry-server"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.705560 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.709497 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4frv\" (UniqueName: \"kubernetes.io/projected/9ce3e7a3-a9c7-420b-a4df-d0445f4b3651-kube-api-access-g4frv\") pod \"marketplace-operator-79b997595-xfg74\" (UID: \"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651\") " pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: E1002 16:22:05.807407 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffca683c_4413_46fe_a196_d44d174991bf.slice/crio-conmon-f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffca683c_4413_46fe_a196_d44d174991bf.slice/crio-f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f73bba_eae3_4e14_8cc0_2d4fe743b0d3.slice/crio-59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf5160b_ad14_4521_9a46_ff205e5a2cd1.slice/crio-conmon-37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f73bba_eae3_4e14_8cc0_2d4fe743b0d3.slice/crio-conmon-59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef.scope\": RecentStats: unable to find data in memory cache]"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.851841 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74"
Oct 02 16:22:05 crc kubenswrapper[4882]: I1002 16:22:05.935711 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.075121 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.083537 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvdff"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.088763 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content\") pod \"9712373b-49f5-4e5e-9309-2031e7a680fa\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089088 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgzwz\" (UniqueName: \"kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz\") pod \"9712373b-49f5-4e5e-9309-2031e7a680fa\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089119 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblmm\" (UniqueName: \"kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm\") pod \"ffca683c-4413-46fe-a196-d44d174991bf\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089139 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities\") pod \"ffca683c-4413-46fe-a196-d44d174991bf\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089158 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") pod \"53105ebf-8ac0-401a-8c49-b6c4780082e5\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089188 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities\") pod \"9712373b-49f5-4e5e-9309-2031e7a680fa\" (UID: \"9712373b-49f5-4e5e-9309-2031e7a680fa\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089244 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content\") pod \"ffca683c-4413-46fe-a196-d44d174991bf\" (UID: \"ffca683c-4413-46fe-a196-d44d174991bf\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089277 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6f5t\" (UniqueName: \"kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t\") pod \"53105ebf-8ac0-401a-8c49-b6c4780082e5\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.089301 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") pod \"53105ebf-8ac0-401a-8c49-b6c4780082e5\" (UID: \"53105ebf-8ac0-401a-8c49-b6c4780082e5\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.090243 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "53105ebf-8ac0-401a-8c49-b6c4780082e5" (UID: "53105ebf-8ac0-401a-8c49-b6c4780082e5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.091680 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities" (OuterVolumeSpecName: "utilities") pod "ffca683c-4413-46fe-a196-d44d174991bf" (UID: "ffca683c-4413-46fe-a196-d44d174991bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.092192 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities" (OuterVolumeSpecName: "utilities") pod "9712373b-49f5-4e5e-9309-2031e7a680fa" (UID: "9712373b-49f5-4e5e-9309-2031e7a680fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.104812 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t" (OuterVolumeSpecName: "kube-api-access-r6f5t") pod "53105ebf-8ac0-401a-8c49-b6c4780082e5" (UID: "53105ebf-8ac0-401a-8c49-b6c4780082e5"). InnerVolumeSpecName "kube-api-access-r6f5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.114185 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.114878 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm" (OuterVolumeSpecName: "kube-api-access-xblmm") pod "ffca683c-4413-46fe-a196-d44d174991bf" (UID: "ffca683c-4413-46fe-a196-d44d174991bf"). InnerVolumeSpecName "kube-api-access-xblmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.115853 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz" (OuterVolumeSpecName: "kube-api-access-mgzwz") pod "9712373b-49f5-4e5e-9309-2031e7a680fa" (UID: "9712373b-49f5-4e5e-9309-2031e7a680fa"). InnerVolumeSpecName "kube-api-access-mgzwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.142567 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "53105ebf-8ac0-401a-8c49-b6c4780082e5" (UID: "53105ebf-8ac0-401a-8c49-b6c4780082e5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.187343 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9712373b-49f5-4e5e-9309-2031e7a680fa" (UID: "9712373b-49f5-4e5e-9309-2031e7a680fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.189692 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5bxs\" (UniqueName: \"kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs\") pod \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.189850 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content\") pod \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190054 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities\") pod \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\" (UID: \"bdf5160b-ad14-4521-9a46-ff205e5a2cd1\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190360 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgzwz\" (UniqueName: \"kubernetes.io/projected/9712373b-49f5-4e5e-9309-2031e7a680fa-kube-api-access-mgzwz\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190539 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblmm\" (UniqueName: \"kubernetes.io/projected/ffca683c-4413-46fe-a196-d44d174991bf-kube-api-access-xblmm\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190629 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190711 4882 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190799 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190884 4882 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53105ebf-8ac0-401a-8c49-b6c4780082e5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.190972 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6f5t\" (UniqueName: \"kubernetes.io/projected/53105ebf-8ac0-401a-8c49-b6c4780082e5-kube-api-access-r6f5t\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.191050 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9712373b-49f5-4e5e-9309-2031e7a680fa-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.192261 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities" (OuterVolumeSpecName: "utilities") pod "bdf5160b-ad14-4521-9a46-ff205e5a2cd1" (UID: "bdf5160b-ad14-4521-9a46-ff205e5a2cd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.196100 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs" (OuterVolumeSpecName: "kube-api-access-z5bxs") pod "bdf5160b-ad14-4521-9a46-ff205e5a2cd1" (UID: "bdf5160b-ad14-4521-9a46-ff205e5a2cd1"). InnerVolumeSpecName "kube-api-access-z5bxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.196559 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffca683c-4413-46fe-a196-d44d174991bf" (UID: "ffca683c-4413-46fe-a196-d44d174991bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.209712 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdf5160b-ad14-4521-9a46-ff205e5a2cd1" (UID: "bdf5160b-ad14-4521-9a46-ff205e5a2cd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.292154 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5bxs\" (UniqueName: \"kubernetes.io/projected/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-kube-api-access-z5bxs\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.292184 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.292198 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf5160b-ad14-4521-9a46-ff205e5a2cd1-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.292211 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca683c-4413-46fe-a196-d44d174991bf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.390682 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xfg74"]
Oct 02 16:22:06 crc kubenswrapper[4882]: W1002 16:22:06.398956 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce3e7a3_a9c7_420b_a4df_d0445f4b3651.slice/crio-f5d73fb3f91fbb212d9abc6e3d7f004be9fcbdcc5718eba0c94cb143e1178f35 WatchSource:0}: Error finding container f5d73fb3f91fbb212d9abc6e3d7f004be9fcbdcc5718eba0c94cb143e1178f35: Status 404 returned error can't find the container with id f5d73fb3f91fbb212d9abc6e3d7f004be9fcbdcc5718eba0c94cb143e1178f35
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.420316 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.511994 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities\") pod \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.512125 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgjgn\" (UniqueName: \"kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn\") pod \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.512249 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content\") pod \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\" (UID: \"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3\") "
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.512864 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities" (OuterVolumeSpecName: "utilities") pod "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" (UID: "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.521381 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn" (OuterVolumeSpecName: "kube-api-access-tgjgn") pod "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" (UID: "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"). InnerVolumeSpecName "kube-api-access-tgjgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.552912 4882 generic.go:334] "Generic (PLEG): container finished" podID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerID="baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2" exitCode=0
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.552994 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerDied","Data":"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.553033 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn4hr" event={"ID":"9712373b-49f5-4e5e-9309-2031e7a680fa","Type":"ContainerDied","Data":"d20592e6c92d42472cd773eca00f0d8ab8c5975e05f3e768048ebcdeccfc37af"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.553055 4882 scope.go:117] "RemoveContainer" containerID="baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.553210 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn4hr"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.557645 4882 generic.go:334] "Generic (PLEG): container finished" podID="ffca683c-4413-46fe-a196-d44d174991bf" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3" exitCode=0
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.557711 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerDied","Data":"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.557745 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvdff" event={"ID":"ffca683c-4413-46fe-a196-d44d174991bf","Type":"ContainerDied","Data":"e19e5bd77d7771abaa10368415e23fcd07bf2b8bb8d8d652ce5caf38922a4781"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.557838 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvdff"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.560890 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74" event={"ID":"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651","Type":"ContainerStarted","Data":"e3d23b429236ba77ef13edf20e7f637d24de9cca8e96ef74d81e453cbf69c2dc"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.560995 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74" event={"ID":"9ce3e7a3-a9c7-420b-a4df-d0445f4b3651","Type":"ContainerStarted","Data":"f5d73fb3f91fbb212d9abc6e3d7f004be9fcbdcc5718eba0c94cb143e1178f35"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.568379 4882 generic.go:334] "Generic (PLEG): container finished" podID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerID="2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9" exitCode=0
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.568681 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" event={"ID":"53105ebf-8ac0-401a-8c49-b6c4780082e5","Type":"ContainerDied","Data":"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.568729 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d972j" event={"ID":"53105ebf-8ac0-401a-8c49-b6c4780082e5","Type":"ContainerDied","Data":"8a94541c4895bb811e8ea4d5676e28d771455e87c3a131339d45073381f072be"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.568968 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d972j"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.572954 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" (UID: "f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.577110 4882 generic.go:334] "Generic (PLEG): container finished" podID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerID="37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471" exitCode=0
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.577249 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerDied","Data":"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.577295 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42mqg" event={"ID":"bdf5160b-ad14-4521-9a46-ff205e5a2cd1","Type":"ContainerDied","Data":"a35fe7e1af2e0ed997ca4e1b7c4ee1b8dcc03a7c9f576c9664c62067eb83bc3c"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.577688 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42mqg"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.580594 4882 generic.go:334] "Generic (PLEG): container finished" podID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerID="59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef" exitCode=0
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.580747 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerDied","Data":"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.580869 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8p45"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.580965 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8p45" event={"ID":"f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3","Type":"ContainerDied","Data":"c098735f392da036bbe8de813de8c781623839d5a66c53ca02115f601538e8d1"}
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.585676 4882 scope.go:117] "RemoveContainer" containerID="b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.599008 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dn4hr"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.611012 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dn4hr"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.614328 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.614370 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgjgn\" (UniqueName: \"kubernetes.io/projected/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-kube-api-access-tgjgn\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.614384 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.632528 4882 scope.go:117] "RemoveContainer" containerID="d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.653183 4882 scope.go:117] "RemoveContainer" containerID="baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.654325 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2\": container with ID starting with baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2 not found: ID does not exist" containerID="baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.654362 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2"} err="failed to get container status \"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2\": rpc error: code = NotFound desc = could not find container \"baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2\": container with ID starting with baabadedaf5c9ec8fb6913c6a187254e7cd1f82e36d0ad8bab7e8a275aee50f2 not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.654388 4882 scope.go:117] "RemoveContainer" containerID="b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.654878 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd\": container with ID starting with b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd not found: ID does not exist" containerID="b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.654925 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd"} err="failed to get container status \"b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd\": rpc error: code = NotFound desc = could not find container \"b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd\": container with ID starting with b1c5e7aa4d90dd4b4d5b2c52830c772a6ca4632e34c5d4e87a8705846bb8c9bd not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.654957 4882 scope.go:117] "RemoveContainer" containerID="d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.655296 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d\": container with ID starting with d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d not found: ID does not exist" containerID="d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.655322 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d"} err="failed to get container status \"d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d\": rpc error: code = NotFound desc = could not find container \"d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d\": container with ID starting with d3e68067e9ecee0e8ddd7ca7120cfece8cc1657a01b74639359e0425ecfe324d not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.655340 4882 scope.go:117] "RemoveContainer" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.676379 4882 scope.go:117] "RemoveContainer" containerID="ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.681943 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8p45"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.686740 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x8p45"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.695054 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.702531 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvdff"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.705418 4882 scope.go:117] "RemoveContainer" containerID="92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.709614 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.712767 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d972j"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.720832 4882 scope.go:117] "RemoveContainer" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.721460 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3\": container with ID starting with f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 not found: ID does not exist" containerID="f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.721506 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3"} err="failed to get container status \"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3\": rpc error: code = NotFound desc = could not find container \"f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3\": container with ID starting with f8351be3850742e3a1f22cda4a28a6e5cba18036be61208d78d652c9c0d566f3 not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.721552 4882 scope.go:117] "RemoveContainer" containerID="ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.721624 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"]
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.721779 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98\": container with ID starting with ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98 not found: ID does not exist" containerID="ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.721849 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98"} err="failed to get container status \"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98\": rpc error: code = NotFound desc = could not find container \"ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98\": container with ID starting with ee8ec994158d768daaa6775e8fba76706fbd4c1e0cafd745a2c5ea3dd0ad6e98 not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.721867 4882 scope.go:117] "RemoveContainer" containerID="92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.722259 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8\": container with ID starting with 92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8 not found: ID does not exist" containerID="92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.722286 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8"} err="failed to get container status \"92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8\": rpc error: code = NotFound desc = could not find container \"92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8\": container with ID starting with 92d31b88f00507b376904f250e1e2fa5c3fcf3a6ade1ccb3eb1ccc35e98f08c8 not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.722300 4882 scope.go:117] "RemoveContainer" containerID="2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.725538 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-42mqg"]
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.735599 4882 scope.go:117] "RemoveContainer" containerID="2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"
Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.736201 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9\": container with ID starting with 2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9 not found: ID does not exist" containerID="2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.736515 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9"} err="failed to get container status \"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9\": rpc error: code = NotFound desc = could not find container \"2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9\": container with ID starting with 2d27d2e23cd5c869547754d33b59c567d9eeceb5faae617370621f5748f41ba9 not found: ID does not exist"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.736550 4882 scope.go:117] "RemoveContainer" containerID="37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.751367 4882 scope.go:117] "RemoveContainer" containerID="93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.771452 4882 scope.go:117] "RemoveContainer" containerID="3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.775567 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" path="/var/lib/kubelet/pods/53105ebf-8ac0-401a-8c49-b6c4780082e5/volumes"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.776286 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" path="/var/lib/kubelet/pods/9712373b-49f5-4e5e-9309-2031e7a680fa/volumes"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.776969 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" path="/var/lib/kubelet/pods/bdf5160b-ad14-4521-9a46-ff205e5a2cd1/volumes"
Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.778015 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3"
path="/var/lib/kubelet/pods/f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3/volumes" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.778598 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffca683c-4413-46fe-a196-d44d174991bf" path="/var/lib/kubelet/pods/ffca683c-4413-46fe-a196-d44d174991bf/volumes" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.795267 4882 scope.go:117] "RemoveContainer" containerID="37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.796247 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471\": container with ID starting with 37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471 not found: ID does not exist" containerID="37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.796311 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471"} err="failed to get container status \"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471\": rpc error: code = NotFound desc = could not find container \"37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471\": container with ID starting with 37d5957bb68dc30172aa6f7ab75db0f3fa7f2edcf93e488719ce05b6adda3471 not found: ID does not exist" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.796360 4882 scope.go:117] "RemoveContainer" containerID="93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.796912 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c\": container with ID starting with 93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c not found: ID does not exist" containerID="93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.796960 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c"} err="failed to get container status \"93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c\": rpc error: code = NotFound desc = could not find container \"93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c\": container with ID starting with 93398106ced757c11c7a1223ca93246f7f7bbacac1775acf763afa61bd51f77c not found: ID does not exist" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.796991 4882 scope.go:117] "RemoveContainer" containerID="3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.797312 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021\": container with ID starting with 3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021 not found: ID does not exist" containerID="3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.797339 4882 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021"} err="failed to get container status \"3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021\": rpc error: code = NotFound desc = could not find container \"3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021\": container with ID starting with 3c50aa1e0b0b2b34dda725845d1eb3b186140793b72e08a0c5731045ae4bd021 not found: ID does not exist" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.797356 4882 scope.go:117] "RemoveContainer" containerID="59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.816721 4882 scope.go:117] "RemoveContainer" containerID="a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.840189 4882 scope.go:117] "RemoveContainer" containerID="12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.863731 4882 scope.go:117] "RemoveContainer" containerID="59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.864788 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef\": container with ID starting with 59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef not found: ID does not exist" containerID="59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.864860 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef"} err="failed to get container status \"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef\": rpc error: code = NotFound desc = could not find container \"59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef\": container with ID starting with 59dce42c3d01c063e226632ef812378d583398105237b90032245b2c62230bef not found: ID does not exist" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.864941 4882 scope.go:117] "RemoveContainer" containerID="a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.865698 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603\": container with ID starting with a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603 not found: ID does not exist" containerID="a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.866172 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603"} err="failed to get container status \"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603\": rpc error: code = NotFound desc = could not find container \"a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603\": container with ID starting with a3ec079f97ef6e429c3960d7e28ef37ddf6c5843a8e504ee42dd5ce9c8801603 not found: ID does not exist" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 
16:22:06.866193 4882 scope.go:117] "RemoveContainer" containerID="12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e" Oct 02 16:22:06 crc kubenswrapper[4882]: E1002 16:22:06.867809 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e\": container with ID starting with 12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e not found: ID does not exist" containerID="12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e" Oct 02 16:22:06 crc kubenswrapper[4882]: I1002 16:22:06.867867 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e"} err="failed to get container status \"12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e\": rpc error: code = NotFound desc = could not find container \"12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e\": container with ID starting with 12fb51a1bac40a667c860d27fecfcc72728e6b22f7f17602802d049b1b7f694e not found: ID does not exist" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.598094 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.602597 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.641494 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xfg74" podStartSLOduration=2.6414716929999997 podStartE2EDuration="2.641471693s" podCreationTimestamp="2025-10-02 16:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:22:07.623544071 +0000 UTC m=+286.372773608" watchObservedRunningTime="2025-10-02 16:22:07.641471693 +0000 UTC m=+286.390701220" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686430 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfrrl"] Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686818 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686838 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686862 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686871 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686882 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686892 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="registry-server" 
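
The burst of "RemoveContainer", "ContainerStatus from runtime service failed", and "DeleteContainer returned error" records between 16:22:06.654 and 16:22:06.867 above is benign cleanup noise rather than a fault: the kubelet is deleting the containers of the removed openshift-marketplace pods, and by the time it asks CRI-O for each container's status, CRI-O has already removed it, so the runtime answers with gRPC NotFound ("ID does not exist"). The surrounding "RemoveStaleState" and "Deleted CPUSet assignment" records are the CPU and memory managers dropping per-container accounting for the same deleted pods. Below is a minimal sketch of the underlying CRI call pattern, assuming the published cri-api and grpc-go packages; the socket path, container ID, and the containerGone helper are illustrative, not taken from the kubelet source:

// containergone.go: treat a gRPC NotFound from ContainerStatus as
// "container already removed by the runtime", the benign case logged above.
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

// containerGone asks the runtime for the container's status and reports
// whether the container no longer exists there.
func containerGone(rt runtimeapi.RuntimeServiceClient, id string) (bool, error) {
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()
	_, err := rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
	switch {
	case err == nil:
		return false, nil // container still known to the runtime
	case status.Code(err) == codes.NotFound:
		return true, nil // already deleted: the benign case logged above
	default:
		return false, err // a real runtime error
	}
}

func main() {
	// CRI-O's default socket path; adjust for other runtimes.
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	fmt.Println(containerGone(runtimeapi.NewRuntimeServiceClient(conn), "b1c5e7aa4d90"))
}

Each E/I pair above shows this shape: the E line is the raw RPC error surfaced through log.go, and the I line is pod_container_deletor noting that the delete had nothing left to remove.
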
Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686906 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686915 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686926 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686934 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686946 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686955 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686972 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.686980 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.686993 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687003 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.687012 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687020 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="extract-utilities" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.687030 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687037 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="extract-content" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.687049 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687057 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.687071 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687099 4882 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: E1002 16:22:07.687113 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687121 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687316 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9712373b-49f5-4e5e-9309-2031e7a680fa" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687335 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf5160b-ad14-4521-9a46-ff205e5a2cd1" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687347 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f73bba-eae3-4e14-8cc0-2d4fe743b0d3" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687359 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffca683c-4413-46fe-a196-d44d174991bf" containerName="registry-server" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.687368 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="53105ebf-8ac0-401a-8c49-b6c4780082e5" containerName="marketplace-operator" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.688464 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.694805 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.698413 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfrrl"] Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.733375 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-catalog-content\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.733468 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbpg\" (UniqueName: \"kubernetes.io/projected/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-kube-api-access-nhbpg\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.733517 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-utilities\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.834966 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-catalog-content\") pod 
\"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.835023 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbpg\" (UniqueName: \"kubernetes.io/projected/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-kube-api-access-nhbpg\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.835063 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-utilities\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.835640 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-utilities\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.836007 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-catalog-content\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.857622 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbpg\" (UniqueName: \"kubernetes.io/projected/6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb-kube-api-access-nhbpg\") pod \"redhat-marketplace-hfrrl\" (UID: \"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb\") " pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.884641 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zlgt4"] Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.885678 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.888299 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.907591 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlgt4"] Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.936790 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlztv\" (UniqueName: \"kubernetes.io/projected/4a42aadc-015c-40ed-9624-f6417e4e68fc-kube-api-access-hlztv\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.936909 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-utilities\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:07 crc kubenswrapper[4882]: I1002 16:22:07.936969 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-catalog-content\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.006666 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.038298 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlztv\" (UniqueName: \"kubernetes.io/projected/4a42aadc-015c-40ed-9624-f6417e4e68fc-kube-api-access-hlztv\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.038398 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-utilities\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.038443 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-catalog-content\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.039013 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-catalog-content\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.039880 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a42aadc-015c-40ed-9624-f6417e4e68fc-utilities\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.061385 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlztv\" (UniqueName: \"kubernetes.io/projected/4a42aadc-015c-40ed-9624-f6417e4e68fc-kube-api-access-hlztv\") pod \"redhat-operators-zlgt4\" (UID: \"4a42aadc-015c-40ed-9624-f6417e4e68fc\") " pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.205160 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.465877 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfrrl"] Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.604272 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfrrl" event={"ID":"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb","Type":"ContainerStarted","Data":"b418b4ec4c553c65a4cf4d7dfdb346796fc59bdf280906c7881b9b422da5bdef"} Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.604610 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfrrl" event={"ID":"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb","Type":"ContainerStarted","Data":"af45dc36988c16aa6d143188341315e0ce45e4132fb256ee26724d09048a1dc4"} Oct 02 16:22:08 crc kubenswrapper[4882]: I1002 16:22:08.620089 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlgt4"] Oct 02 16:22:08 crc kubenswrapper[4882]: W1002 16:22:08.658164 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a42aadc_015c_40ed_9624_f6417e4e68fc.slice/crio-dcba8f8f14816eecc6a1155f4a4cdbafbb3f00dadf45d35214eab51d117f6bde WatchSource:0}: Error finding container dcba8f8f14816eecc6a1155f4a4cdbafbb3f00dadf45d35214eab51d117f6bde: Status 404 returned error can't find the container with id dcba8f8f14816eecc6a1155f4a4cdbafbb3f00dadf45d35214eab51d117f6bde Oct 02 16:22:09 crc kubenswrapper[4882]: I1002 16:22:09.611626 4882 generic.go:334] "Generic (PLEG): container finished" podID="4a42aadc-015c-40ed-9624-f6417e4e68fc" containerID="055a377074a29d0d7ca2fc3305f5d846aac9db9749135f79340c775ec1a643ca" exitCode=0 Oct 02 16:22:09 crc kubenswrapper[4882]: I1002 16:22:09.611748 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlgt4" event={"ID":"4a42aadc-015c-40ed-9624-f6417e4e68fc","Type":"ContainerDied","Data":"055a377074a29d0d7ca2fc3305f5d846aac9db9749135f79340c775ec1a643ca"} Oct 02 16:22:09 crc kubenswrapper[4882]: I1002 16:22:09.611883 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlgt4" event={"ID":"4a42aadc-015c-40ed-9624-f6417e4e68fc","Type":"ContainerStarted","Data":"dcba8f8f14816eecc6a1155f4a4cdbafbb3f00dadf45d35214eab51d117f6bde"} Oct 02 16:22:09 crc kubenswrapper[4882]: I1002 16:22:09.613702 4882 generic.go:334] "Generic (PLEG): container finished" podID="6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb" containerID="b418b4ec4c553c65a4cf4d7dfdb346796fc59bdf280906c7881b9b422da5bdef" exitCode=0 Oct 02 16:22:09 crc kubenswrapper[4882]: I1002 16:22:09.614688 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfrrl" event={"ID":"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb","Type":"ContainerDied","Data":"b418b4ec4c553c65a4cf4d7dfdb346796fc59bdf280906c7881b9b422da5bdef"} Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.082524 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2qnvl"] Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.083911 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.087376 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.094010 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qnvl"] Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.179773 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-catalog-content\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.179896 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2zv\" (UniqueName: \"kubernetes.io/projected/efdb4f07-c013-4677-86ec-bdda64483def-kube-api-access-nt2zv\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.180001 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-utilities\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.278464 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5wm7q"] Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.280673 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt2zv\" (UniqueName: \"kubernetes.io/projected/efdb4f07-c013-4677-86ec-bdda64483def-kube-api-access-nt2zv\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.280808 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-utilities\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.282980 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-catalog-content\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.282598 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.284278 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wm7q"] Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.284460 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-catalog-content\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.284613 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdb4f07-c013-4677-86ec-bdda64483def-utilities\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.285076 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.318292 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2zv\" (UniqueName: \"kubernetes.io/projected/efdb4f07-c013-4677-86ec-bdda64483def-kube-api-access-nt2zv\") pod \"certified-operators-2qnvl\" (UID: \"efdb4f07-c013-4677-86ec-bdda64483def\") " pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.384438 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdktd\" (UniqueName: \"kubernetes.io/projected/b276f479-6426-4f8f-acd1-9823370944e2-kube-api-access-gdktd\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.385072 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-utilities\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.385131 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-catalog-content\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.407189 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.486511 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-utilities\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.487051 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-catalog-content\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.487136 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdktd\" (UniqueName: \"kubernetes.io/projected/b276f479-6426-4f8f-acd1-9823370944e2-kube-api-access-gdktd\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.487200 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-utilities\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.487477 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b276f479-6426-4f8f-acd1-9823370944e2-catalog-content\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.520633 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdktd\" (UniqueName: \"kubernetes.io/projected/b276f479-6426-4f8f-acd1-9823370944e2-kube-api-access-gdktd\") pod \"community-operators-5wm7q\" (UID: \"b276f479-6426-4f8f-acd1-9823370944e2\") " pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.609941 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.625035 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlgt4" event={"ID":"4a42aadc-015c-40ed-9624-f6417e4e68fc","Type":"ContainerStarted","Data":"16381f1ebc87aace4b9ee4772ba6ecab99a134c3404a749163958c218c67dd4c"} Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.628147 4882 generic.go:334] "Generic (PLEG): container finished" podID="6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb" containerID="8f5f3cf44ca43608476ae01d2638c632019251b3f0abc74b8717d349bf49a3bd" exitCode=0 Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.628205 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfrrl" event={"ID":"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb","Type":"ContainerDied","Data":"8f5f3cf44ca43608476ae01d2638c632019251b3f0abc74b8717d349bf49a3bd"} Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.813048 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wm7q"] Oct 02 16:22:10 crc kubenswrapper[4882]: I1002 16:22:10.826158 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qnvl"] Oct 02 16:22:10 crc kubenswrapper[4882]: W1002 16:22:10.831852 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefdb4f07_c013_4677_86ec_bdda64483def.slice/crio-974cea36917d64a0f33d24da52065ac152759c52bdcc0c0a0ba74079f7e0ecf6 WatchSource:0}: Error finding container 974cea36917d64a0f33d24da52065ac152759c52bdcc0c0a0ba74079f7e0ecf6: Status 404 returned error can't find the container with id 974cea36917d64a0f33d24da52065ac152759c52bdcc0c0a0ba74079f7e0ecf6 Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.636145 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfrrl" event={"ID":"6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb","Type":"ContainerStarted","Data":"be00c8c591a5d5cf4c1903049ebc1e5c1f5a79854624ae4284474997d804bfa5"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.638263 4882 generic.go:334] "Generic (PLEG): container finished" podID="b276f479-6426-4f8f-acd1-9823370944e2" containerID="db33fcf58649409dcd2b86d97fe9536ebc3a4e9267813748461168177862c180" exitCode=0 Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.638335 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wm7q" event={"ID":"b276f479-6426-4f8f-acd1-9823370944e2","Type":"ContainerDied","Data":"db33fcf58649409dcd2b86d97fe9536ebc3a4e9267813748461168177862c180"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.638397 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wm7q" event={"ID":"b276f479-6426-4f8f-acd1-9823370944e2","Type":"ContainerStarted","Data":"175d07e1079fb5189962c71475adc972272a25aebf898ef22affab58a8eb6281"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.640205 4882 generic.go:334] "Generic (PLEG): container finished" podID="efdb4f07-c013-4677-86ec-bdda64483def" containerID="263b9ee2cff0425fe5ad7752c725580e619fe2b608e951ebff166f871c3b1e72" exitCode=0 Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.640278 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qnvl" 
event={"ID":"efdb4f07-c013-4677-86ec-bdda64483def","Type":"ContainerDied","Data":"263b9ee2cff0425fe5ad7752c725580e619fe2b608e951ebff166f871c3b1e72"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.640312 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qnvl" event={"ID":"efdb4f07-c013-4677-86ec-bdda64483def","Type":"ContainerStarted","Data":"974cea36917d64a0f33d24da52065ac152759c52bdcc0c0a0ba74079f7e0ecf6"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.642462 4882 generic.go:334] "Generic (PLEG): container finished" podID="4a42aadc-015c-40ed-9624-f6417e4e68fc" containerID="16381f1ebc87aace4b9ee4772ba6ecab99a134c3404a749163958c218c67dd4c" exitCode=0 Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.642537 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlgt4" event={"ID":"4a42aadc-015c-40ed-9624-f6417e4e68fc","Type":"ContainerDied","Data":"16381f1ebc87aace4b9ee4772ba6ecab99a134c3404a749163958c218c67dd4c"} Oct 02 16:22:11 crc kubenswrapper[4882]: I1002 16:22:11.662765 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfrrl" podStartSLOduration=2.994642148 podStartE2EDuration="4.662729521s" podCreationTimestamp="2025-10-02 16:22:07 +0000 UTC" firstStartedPulling="2025-10-02 16:22:09.61570351 +0000 UTC m=+288.364933037" lastFinishedPulling="2025-10-02 16:22:11.283790883 +0000 UTC m=+290.033020410" observedRunningTime="2025-10-02 16:22:11.657524402 +0000 UTC m=+290.406753949" watchObservedRunningTime="2025-10-02 16:22:11.662729521 +0000 UTC m=+290.411959058" Oct 02 16:22:12 crc kubenswrapper[4882]: I1002 16:22:12.651324 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlgt4" event={"ID":"4a42aadc-015c-40ed-9624-f6417e4e68fc","Type":"ContainerStarted","Data":"0457c1c80dd80f8d45a4671184bc2adb239155b35dd785cf78857958489d952c"} Oct 02 16:22:12 crc kubenswrapper[4882]: I1002 16:22:12.653656 4882 generic.go:334] "Generic (PLEG): container finished" podID="b276f479-6426-4f8f-acd1-9823370944e2" containerID="99055efe031d15a4425ee84ef008ce19f735d4e2ff65498b49f7cc568013b601" exitCode=0 Oct 02 16:22:12 crc kubenswrapper[4882]: I1002 16:22:12.653721 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wm7q" event={"ID":"b276f479-6426-4f8f-acd1-9823370944e2","Type":"ContainerDied","Data":"99055efe031d15a4425ee84ef008ce19f735d4e2ff65498b49f7cc568013b601"} Oct 02 16:22:12 crc kubenswrapper[4882]: I1002 16:22:12.656391 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qnvl" event={"ID":"efdb4f07-c013-4677-86ec-bdda64483def","Type":"ContainerStarted","Data":"0feeb1d9353b5e6acdd03f97ef0104d0a79f9368584123c5c568e88ce5f00870"} Oct 02 16:22:12 crc kubenswrapper[4882]: I1002 16:22:12.670722 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zlgt4" podStartSLOduration=3.131637222 podStartE2EDuration="5.670695967s" podCreationTimestamp="2025-10-02 16:22:07 +0000 UTC" firstStartedPulling="2025-10-02 16:22:09.614174062 +0000 UTC m=+288.363403589" lastFinishedPulling="2025-10-02 16:22:12.153232807 +0000 UTC m=+290.902462334" observedRunningTime="2025-10-02 16:22:12.669403977 +0000 UTC m=+291.418633524" watchObservedRunningTime="2025-10-02 16:22:12.670695967 +0000 UTC m=+291.419925494" Oct 02 16:22:13 crc 
kubenswrapper[4882]: I1002 16:22:13.664623 4882 generic.go:334] "Generic (PLEG): container finished" podID="efdb4f07-c013-4677-86ec-bdda64483def" containerID="0feeb1d9353b5e6acdd03f97ef0104d0a79f9368584123c5c568e88ce5f00870" exitCode=0 Oct 02 16:22:13 crc kubenswrapper[4882]: I1002 16:22:13.664717 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qnvl" event={"ID":"efdb4f07-c013-4677-86ec-bdda64483def","Type":"ContainerDied","Data":"0feeb1d9353b5e6acdd03f97ef0104d0a79f9368584123c5c568e88ce5f00870"} Oct 02 16:22:14 crc kubenswrapper[4882]: I1002 16:22:14.674138 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qnvl" event={"ID":"efdb4f07-c013-4677-86ec-bdda64483def","Type":"ContainerStarted","Data":"65f8bb56d1f418ce6d95f8690f843aea35a611abb2569292cf3bea81d46d5ac4"} Oct 02 16:22:14 crc kubenswrapper[4882]: I1002 16:22:14.680169 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wm7q" event={"ID":"b276f479-6426-4f8f-acd1-9823370944e2","Type":"ContainerStarted","Data":"1a6012324063e3a62cf7b0481443b533de954a1ef6163662b2501ec349260957"} Oct 02 16:22:14 crc kubenswrapper[4882]: I1002 16:22:14.696325 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2qnvl" podStartSLOduration=2.274882335 podStartE2EDuration="4.696302081s" podCreationTimestamp="2025-10-02 16:22:10 +0000 UTC" firstStartedPulling="2025-10-02 16:22:11.643416795 +0000 UTC m=+290.392646312" lastFinishedPulling="2025-10-02 16:22:14.064836531 +0000 UTC m=+292.814066058" observedRunningTime="2025-10-02 16:22:14.692942828 +0000 UTC m=+293.442172355" watchObservedRunningTime="2025-10-02 16:22:14.696302081 +0000 UTC m=+293.445531608" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.007456 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.008423 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.057506 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.079317 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5wm7q" podStartSLOduration=6.351813485 podStartE2EDuration="8.079282351s" podCreationTimestamp="2025-10-02 16:22:10 +0000 UTC" firstStartedPulling="2025-10-02 16:22:11.63997491 +0000 UTC m=+290.389204437" lastFinishedPulling="2025-10-02 16:22:13.367443736 +0000 UTC m=+292.116673303" observedRunningTime="2025-10-02 16:22:14.717798201 +0000 UTC m=+293.467027718" watchObservedRunningTime="2025-10-02 16:22:18.079282351 +0000 UTC m=+296.828511968"
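
The two "Observed pod startup duration" records above are internally consistent and show how the tracker separates image-pull time from startup latency: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that interval minus the pull window between firstStartedPulling and lastFinishedPulling, taken from the monotonic m=+... readings printed next to each timestamp. A quick check of the certified-operators-2qnvl record, with values copied from the log; the parse layout is the default format Go prints time.Time in, which is how these fields are rendered, and treating the m=+ offsets as the pull window is an inference from the arithmetic, not from the kubelet source:

// slocheck.go: reproduce podStartE2EDuration and podStartSLOduration for
// certified-operators-2qnvl from the timestamps in the record above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-10-02 16:22:10 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-10-02 16:22:14.696302081 +0000 UTC")

	// Wall-clock end-to-end startup: creation until running was observed.
	e2e := observed.Sub(created)

	// Pull window from the monotonic readings (m=+290.392646312 to
	// m=+292.814066058), written out in integer nanoseconds.
	pull := time.Duration(292814066058 - 290392646312) // 2.421419746s

	fmt.Println(e2e, e2e-pull) // prints "4.696302081s 2.274882335s"
}

The community-operators-5wm7q record obeys the same identity: 8.079282351s minus its monotonic pull window (292.116673303 - 290.389204437 = 1.727468866s) gives the logged podStartSLOduration of 6.351813485.
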
pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.742913 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zlgt4" Oct 02 16:22:18 crc kubenswrapper[4882]: I1002 16:22:18.749082 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfrrl" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.407652 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.407712 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.450190 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.610615 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.610668 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.662761 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.751932 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2qnvl" Oct 02 16:22:20 crc kubenswrapper[4882]: I1002 16:22:20.756674 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wm7q" Oct 02 16:23:39 crc kubenswrapper[4882]: I1002 16:23:39.390983 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:23:39 crc kubenswrapper[4882]: I1002 16:23:39.391730 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:24:09 crc kubenswrapper[4882]: I1002 16:24:09.390548 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:24:09 crc kubenswrapper[4882]: I1002 16:24:09.391670 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:24:32 crc kubenswrapper[4882]: I1002 16:24:32.965139 4882 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428v"] Oct 02 16:24:32 crc kubenswrapper[4882]: I1002 16:24:32.966729 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.037874 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428v"] Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129024 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129090 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2694d049-4354-4787-979d-b22a370164fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129115 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-registry-tls\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129152 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-bound-sa-token\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129180 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2694d049-4354-4787-979d-b22a370164fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129221 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-registry-certificates\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.129240 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-trusted-ca\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc 
kubenswrapper[4882]: I1002 16:24:33.129290 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4g2\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-kube-api-access-zs4g2\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.152103 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230311 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4g2\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-kube-api-access-zs4g2\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230409 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2694d049-4354-4787-979d-b22a370164fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230476 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-registry-tls\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230551 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-bound-sa-token\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230615 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2694d049-4354-4787-979d-b22a370164fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230692 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-registry-certificates\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.230745 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-trusted-ca\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.231683 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2694d049-4354-4787-979d-b22a370164fa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.232421 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-trusted-ca\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.233055 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2694d049-4354-4787-979d-b22a370164fa-registry-certificates\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.239395 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-registry-tls\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.240928 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2694d049-4354-4787-979d-b22a370164fa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.252835 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-bound-sa-token\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.259589 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4g2\" (UniqueName: \"kubernetes.io/projected/2694d049-4354-4787-979d-b22a370164fa-kube-api-access-zs4g2\") pod \"image-registry-66df7c8f76-p428v\" (UID: \"2694d049-4354-4787-979d-b22a370164fa\") " pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.288475 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:33 crc kubenswrapper[4882]: I1002 16:24:33.521920 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p428v"] Oct 02 16:24:34 crc kubenswrapper[4882]: I1002 16:24:34.514738 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" event={"ID":"2694d049-4354-4787-979d-b22a370164fa","Type":"ContainerStarted","Data":"2cf158a0c2289b1d194c4840bd3da2c9dd2533cae26498f1a5f3ba1463933dc5"} Oct 02 16:24:34 crc kubenswrapper[4882]: I1002 16:24:34.514825 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" event={"ID":"2694d049-4354-4787-979d-b22a370164fa","Type":"ContainerStarted","Data":"32acce58801771310da92c7134ed36c684f0774f19d8f55c03424e8432805c14"} Oct 02 16:24:34 crc kubenswrapper[4882]: I1002 16:24:34.514971 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:34 crc kubenswrapper[4882]: I1002 16:24:34.538532 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" podStartSLOduration=2.538495406 podStartE2EDuration="2.538495406s" podCreationTimestamp="2025-10-02 16:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:24:34.536426044 +0000 UTC m=+433.285655571" watchObservedRunningTime="2025-10-02 16:24:34.538495406 +0000 UTC m=+433.287724953" Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.390780 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.391176 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.391257 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.391924 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.391978 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8" gracePeriod=600 Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 
16:24:39.550402 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8" exitCode=0 Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.550524 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8"} Oct 02 16:24:39 crc kubenswrapper[4882]: I1002 16:24:39.551075 4882 scope.go:117] "RemoveContainer" containerID="0752017273c71ed8326292bac2e18e20fee48079bf1f97ce01537a0ffbfc8b48" Oct 02 16:24:40 crc kubenswrapper[4882]: I1002 16:24:40.559416 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807"} Oct 02 16:24:53 crc kubenswrapper[4882]: I1002 16:24:53.294603 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p428v" Oct 02 16:24:53 crc kubenswrapper[4882]: I1002 16:24:53.350250 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.386440 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" podUID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" containerName="registry" containerID="cri-o://1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853" gracePeriod=30 Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.736965 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.790651 4882 generic.go:334] "Generic (PLEG): container finished" podID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" containerID="1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853" exitCode=0 Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.790704 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" event={"ID":"c207eb71-c634-4cc1-b3f7-720ddbb6dc56","Type":"ContainerDied","Data":"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853"} Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.790707 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.790736 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fl2p9" event={"ID":"c207eb71-c634-4cc1-b3f7-720ddbb6dc56","Type":"ContainerDied","Data":"edab5d4e0b3f245e89df82a16d00e20a92211e45de26510286d3404e29557a20"} Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.790753 4882 scope.go:117] "RemoveContainer" containerID="1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.806502 4882 scope.go:117] "RemoveContainer" containerID="1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853" Oct 02 16:25:18 crc kubenswrapper[4882]: E1002 16:25:18.806945 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853\": container with ID starting with 1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853 not found: ID does not exist" containerID="1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.806998 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853"} err="failed to get container status \"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853\": rpc error: code = NotFound desc = could not find container \"1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853\": container with ID starting with 1edfa5331e5ff1d85cd42a2346e8eb9d2eb96678261739084d4c01cf5709e853 not found: ID does not exist" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.822265 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.822306 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.822349 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.822375 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.823163 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" 
(UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.829161 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.829564 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.829672 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.923449 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.923592 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.923623 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.923642 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pq49\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49\") pod \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\" (UID: \"c207eb71-c634-4cc1-b3f7-720ddbb6dc56\") " Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.924048 4882 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.924082 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.924095 4882 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.924106 4882 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.924451 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.927830 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49" (OuterVolumeSpecName: "kube-api-access-5pq49") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "kube-api-access-5pq49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.933872 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 16:25:18 crc kubenswrapper[4882]: I1002 16:25:18.941720 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c207eb71-c634-4cc1-b3f7-720ddbb6dc56" (UID: "c207eb71-c634-4cc1-b3f7-720ddbb6dc56"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:25:19 crc kubenswrapper[4882]: I1002 16:25:19.024740 4882 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:19 crc kubenswrapper[4882]: I1002 16:25:19.024778 4882 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:19 crc kubenswrapper[4882]: I1002 16:25:19.024794 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pq49\" (UniqueName: \"kubernetes.io/projected/c207eb71-c634-4cc1-b3f7-720ddbb6dc56-kube-api-access-5pq49\") on node \"crc\" DevicePath \"\"" Oct 02 16:25:19 crc kubenswrapper[4882]: I1002 16:25:19.140090 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:25:19 crc kubenswrapper[4882]: I1002 16:25:19.147479 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fl2p9"] Oct 02 16:25:20 crc kubenswrapper[4882]: I1002 16:25:20.767995 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" path="/var/lib/kubelet/pods/c207eb71-c634-4cc1-b3f7-720ddbb6dc56/volumes" Oct 02 16:26:39 crc kubenswrapper[4882]: I1002 16:26:39.391053 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:26:39 crc kubenswrapper[4882]: I1002 16:26:39.391657 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:27:09 crc kubenswrapper[4882]: I1002 16:27:09.390788 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:27:09 crc kubenswrapper[4882]: I1002 16:27:09.391480 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.390432 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.391198 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" 
podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.391313 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.392300 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.392411 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807" gracePeriod=600 Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.673731 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807" exitCode=0 Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.673793 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807"} Oct 02 16:27:39 crc kubenswrapper[4882]: I1002 16:27:39.673872 4882 scope.go:117] "RemoveContainer" containerID="77c24662004c4356573f7635f0fdd3bb521f2ad2a8e7e5b7d9c1f4117b51aef8" Oct 02 16:27:40 crc kubenswrapper[4882]: I1002 16:27:40.683252 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480"} Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.624278 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p6qjz"] Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.625769 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-controller" containerID="cri-o://fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.625842 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="nbdb" containerID="cri-o://b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.625954 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="northd" 
containerID="cri-o://7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.626014 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.626080 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-node" containerID="cri-o://d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.626119 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-acl-logging" containerID="cri-o://3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.626421 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="sbdb" containerID="cri-o://aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" gracePeriod=30 Oct 02 16:29:20 crc kubenswrapper[4882]: I1002 16:29:20.671933 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" containerID="cri-o://e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" gracePeriod=30 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.115014 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/3.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.118402 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovn-acl-logging/0.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.119106 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovn-controller/0.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.119655 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.176825 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j48m7"] Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177054 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177068 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177085 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177095 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177103 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="northd" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177110 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="northd" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177122 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kubecfg-setup" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177128 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kubecfg-setup" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177140 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177147 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177153 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177161 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177172 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177179 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177192 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-acl-logging" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177200 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-acl-logging" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177228 4882 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-node" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177236 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-node" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177242 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="nbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177250 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="nbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177259 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="sbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177266 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="sbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.177279 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" containerName="registry" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.177285 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" containerName="registry" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181030 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-acl-logging" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181064 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="c207eb71-c634-4cc1-b3f7-720ddbb6dc56" containerName="registry" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181074 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181087 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="sbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181095 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181102 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovn-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181108 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181116 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="nbdb" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181125 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181132 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="northd" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181140 4882 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="kube-rbac-proxy-node" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.181287 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181298 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.181310 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181315 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181407 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.181418 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerName="ovnkube-controller" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.183353 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289165 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289292 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289359 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289455 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log" (OuterVolumeSpecName: "node-log") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289491 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7zv5\" (UniqueName: \"kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289537 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289589 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289597 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289615 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289632 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289687 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289761 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289802 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289859 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289920 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289965 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.289997 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290040 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290062 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod 
"7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290083 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290120 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290155 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290194 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290278 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290325 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib\") pod \"7911af1a-fc82-463b-b72d-9c55e5073e45\" (UID: \"7911af1a-fc82-463b-b72d-9c55e5073e45\") " Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290125 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290373 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290155 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290184 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290247 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290283 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290308 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290342 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket" (OuterVolumeSpecName: "log-socket") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290430 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290441 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290466 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash" (OuterVolumeSpecName: "host-slash") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290616 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-etc-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290671 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290762 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-systemd-units\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290823 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-var-lib-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290872 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290905 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290918 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-script-lib\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.290987 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-config\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291020 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-netd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291058 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-ovn\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291101 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqmzj\" (UniqueName: \"kubernetes.io/projected/be76cf50-83d6-4728-9fd0-ef42e458b022-kube-api-access-pqmzj\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291155 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-netns\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291190 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-kubelet\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291256 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-log-socket\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291304 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/be76cf50-83d6-4728-9fd0-ef42e458b022-ovn-node-metrics-cert\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291345 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-node-log\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291367 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-systemd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291392 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-bin\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291419 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-slash\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291466 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-env-overrides\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291498 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291598 4882 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291641 4882 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291671 4882 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291697 4882 reconciler_common.go:293] "Volume detached for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291722 4882 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291747 4882 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291772 4882 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291797 4882 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7911af1a-fc82-463b-b72d-9c55e5073e45-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291822 4882 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291849 4882 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291875 4882 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291900 4882 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291925 4882 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291951 4882 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291972 4882 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.291997 4882 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.292052 4882 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.296127 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5" (OuterVolumeSpecName: "kube-api-access-l7zv5") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "kube-api-access-l7zv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.296250 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.304099 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7911af1a-fc82-463b-b72d-9c55e5073e45" (UID: "7911af1a-fc82-463b-b72d-9c55e5073e45"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.356753 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/2.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.357348 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/1.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.357402 4882 generic.go:334] "Generic (PLEG): container finished" podID="565a5a5f-e220-4ce6-86a7-f94f9dbe48c2" containerID="7aa35c2662c0805faaf732c1db3dfb6ccb2d6c73ddd56a45203e524734283fef" exitCode=2 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.357487 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerDied","Data":"7aa35c2662c0805faaf732c1db3dfb6ccb2d6c73ddd56a45203e524734283fef"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.357547 4882 scope.go:117] "RemoveContainer" containerID="ca5c7c9e80926f7329bd05bf3c1a2510eec871c976cc91393a802362e3e8019e" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.357950 4882 scope.go:117] "RemoveContainer" containerID="7aa35c2662c0805faaf732c1db3dfb6ccb2d6c73ddd56a45203e524734283fef" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.359076 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ppcpg_openshift-multus(565a5a5f-e220-4ce6-86a7-f94f9dbe48c2)\"" pod="openshift-multus/multus-ppcpg" podUID="565a5a5f-e220-4ce6-86a7-f94f9dbe48c2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.360250 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovnkube-controller/3.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.362985 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovn-acl-logging/0.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.363710 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p6qjz_7911af1a-fc82-463b-b72d-9c55e5073e45/ovn-controller/0.log" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364137 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364185 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364196 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364230 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364207 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364274 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364279 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364285 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" exitCode=0 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364293 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364296 4882 generic.go:334] "Generic (PLEG): container finished" podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" exitCode=143 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364304 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364307 4882 generic.go:334] "Generic (PLEG): container finished" 
podID="7911af1a-fc82-463b-b72d-9c55e5073e45" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" exitCode=143 Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364316 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364317 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364328 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364411 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364423 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364429 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364435 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364442 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364448 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364454 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364460 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364465 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364471 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364478 4882 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364486 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364492 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364497 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364503 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364508 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364513 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364518 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364523 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364529 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364535 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364542 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364551 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364558 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:29:21 crc 
kubenswrapper[4882]: I1002 16:29:21.364563 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364568 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364573 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364578 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364583 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364589 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364593 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364599 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364605 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p6qjz" event={"ID":"7911af1a-fc82-463b-b72d-9c55e5073e45","Type":"ContainerDied","Data":"fdec1172613743de8c303b4c53b53335599821745f00a282075465cd5c22b253"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364613 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364619 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364625 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364631 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364638 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} Oct 02 16:29:21 crc 
kubenswrapper[4882]: I1002 16:29:21.364646 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364653 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364659 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364666 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.364672 4882 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394174 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-etc-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394245 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394278 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-systemd-units\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394307 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-var-lib-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394331 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394358 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-script-lib\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394419 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-config\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394444 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-netd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394466 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-ovn\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394487 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqmzj\" (UniqueName: \"kubernetes.io/projected/be76cf50-83d6-4728-9fd0-ef42e458b022-kube-api-access-pqmzj\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394515 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-netns\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394543 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-kubelet\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394563 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-log-socket\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394587 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be76cf50-83d6-4728-9fd0-ef42e458b022-ovn-node-metrics-cert\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394611 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-node-log\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394627 
4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-systemd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394649 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-bin\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394667 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-slash\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394688 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-env-overrides\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394708 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394759 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7911af1a-fc82-463b-b72d-9c55e5073e45-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394774 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7zv5\" (UniqueName: \"kubernetes.io/projected/7911af1a-fc82-463b-b72d-9c55e5073e45-kube-api-access-l7zv5\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394784 4882 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7911af1a-fc82-463b-b72d-9c55e5073e45-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394828 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394867 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-etc-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394895 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394923 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-systemd-units\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394956 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-var-lib-openvswitch\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.394990 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.395778 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-script-lib\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396379 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-ovnkube-config\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396420 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-netd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396451 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-ovn\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396755 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-run-netns\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396788 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-kubelet\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.396983 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-log-socket\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.398570 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-cni-bin\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.398683 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-node-log\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.398825 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-run-systemd\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.398859 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be76cf50-83d6-4728-9fd0-ef42e458b022-host-slash\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.399401 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be76cf50-83d6-4728-9fd0-ef42e458b022-env-overrides\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.400458 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be76cf50-83d6-4728-9fd0-ef42e458b022-ovn-node-metrics-cert\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.407353 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p6qjz"] Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.415469 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqmzj\" (UniqueName: \"kubernetes.io/projected/be76cf50-83d6-4728-9fd0-ef42e458b022-kube-api-access-pqmzj\") pod \"ovnkube-node-j48m7\" (UID: \"be76cf50-83d6-4728-9fd0-ef42e458b022\") " pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.417421 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p6qjz"] Oct 02 16:29:21 
crc kubenswrapper[4882]: I1002 16:29:21.481862 4882 scope.go:117] "RemoveContainer" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.501330 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.505112 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.525682 4882 scope.go:117] "RemoveContainer" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.545726 4882 scope.go:117] "RemoveContainer" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.572543 4882 scope.go:117] "RemoveContainer" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.588917 4882 scope.go:117] "RemoveContainer" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.604517 4882 scope.go:117] "RemoveContainer" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.644449 4882 scope.go:117] "RemoveContainer" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.665707 4882 scope.go:117] "RemoveContainer" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.680096 4882 scope.go:117] "RemoveContainer" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.692001 4882 scope.go:117] "RemoveContainer" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.692393 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": container with ID starting with e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2 not found: ID does not exist" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.692436 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} err="failed to get container status \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": rpc error: code = NotFound desc = could not find container \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": container with ID starting with e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.692464 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.692899 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": container with ID starting with 8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d not found: ID does not exist" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.692944 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} err="failed to get container status \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": rpc error: code = NotFound desc = could not find container \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": container with ID starting with 8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.692970 4882 scope.go:117] "RemoveContainer" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.693392 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": container with ID starting with aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c not found: ID does not exist" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.693425 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} err="failed to get container status \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": rpc error: code = NotFound desc = could not find container \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": container with ID starting with aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.693445 4882 scope.go:117] "RemoveContainer" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.693789 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": container with ID starting with b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3 not found: ID does not exist" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.693815 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} err="failed to get container status \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": rpc error: code = NotFound desc = could not find container \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": container with ID starting with b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.693833 4882 scope.go:117] "RemoveContainer" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 
crc kubenswrapper[4882]: E1002 16:29:21.694162 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": container with ID starting with 7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70 not found: ID does not exist" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.694190 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} err="failed to get container status \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": rpc error: code = NotFound desc = could not find container \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": container with ID starting with 7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.694228 4882 scope.go:117] "RemoveContainer" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.694464 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": container with ID starting with 13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8 not found: ID does not exist" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.694514 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} err="failed to get container status \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": rpc error: code = NotFound desc = could not find container \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": container with ID starting with 13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.694543 4882 scope.go:117] "RemoveContainer" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.694979 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": container with ID starting with d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be not found: ID does not exist" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.695008 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} err="failed to get container status \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": rpc error: code = NotFound desc = could not find container \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": container with ID starting with d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be not found: ID does not exist" Oct 02 16:29:21 crc 
kubenswrapper[4882]: I1002 16:29:21.695025 4882 scope.go:117] "RemoveContainer" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.695376 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": container with ID starting with 3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d not found: ID does not exist" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.695409 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} err="failed to get container status \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": rpc error: code = NotFound desc = could not find container \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": container with ID starting with 3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.695430 4882 scope.go:117] "RemoveContainer" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.695669 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": container with ID starting with fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6 not found: ID does not exist" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.695705 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} err="failed to get container status \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": rpc error: code = NotFound desc = could not find container \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": container with ID starting with fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.695727 4882 scope.go:117] "RemoveContainer" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: E1002 16:29:21.696042 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": container with ID starting with b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00 not found: ID does not exist" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696068 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} err="failed to get container status \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": rpc error: code = NotFound desc = could not find container 
\"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": container with ID starting with b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696089 4882 scope.go:117] "RemoveContainer" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696404 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} err="failed to get container status \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": rpc error: code = NotFound desc = could not find container \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": container with ID starting with e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696426 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696699 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} err="failed to get container status \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": rpc error: code = NotFound desc = could not find container \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": container with ID starting with 8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696729 4882 scope.go:117] "RemoveContainer" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696968 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} err="failed to get container status \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": rpc error: code = NotFound desc = could not find container \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": container with ID starting with aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.696990 4882 scope.go:117] "RemoveContainer" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.697388 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} err="failed to get container status \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": rpc error: code = NotFound desc = could not find container \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": container with ID starting with b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.697418 4882 scope.go:117] "RemoveContainer" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.697709 4882 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} err="failed to get container status \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": rpc error: code = NotFound desc = could not find container \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": container with ID starting with 7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.697740 4882 scope.go:117] "RemoveContainer" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698022 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} err="failed to get container status \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": rpc error: code = NotFound desc = could not find container \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": container with ID starting with 13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698050 4882 scope.go:117] "RemoveContainer" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698377 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} err="failed to get container status \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": rpc error: code = NotFound desc = could not find container \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": container with ID starting with d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698401 4882 scope.go:117] "RemoveContainer" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698650 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} err="failed to get container status \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": rpc error: code = NotFound desc = could not find container \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": container with ID starting with 3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698674 4882 scope.go:117] "RemoveContainer" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698917 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} err="failed to get container status \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": rpc error: code = NotFound desc = could not find container \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": container with ID starting with 
fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.698948 4882 scope.go:117] "RemoveContainer" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.699315 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} err="failed to get container status \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": rpc error: code = NotFound desc = could not find container \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": container with ID starting with b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.699345 4882 scope.go:117] "RemoveContainer" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.699599 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} err="failed to get container status \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": rpc error: code = NotFound desc = could not find container \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": container with ID starting with e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.699633 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700068 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} err="failed to get container status \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": rpc error: code = NotFound desc = could not find container \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": container with ID starting with 8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700096 4882 scope.go:117] "RemoveContainer" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700351 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} err="failed to get container status \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": rpc error: code = NotFound desc = could not find container \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": container with ID starting with aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700382 4882 scope.go:117] "RemoveContainer" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700601 4882 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} err="failed to get container status \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": rpc error: code = NotFound desc = could not find container \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": container with ID starting with b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700626 4882 scope.go:117] "RemoveContainer" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700815 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} err="failed to get container status \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": rpc error: code = NotFound desc = could not find container \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": container with ID starting with 7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.700839 4882 scope.go:117] "RemoveContainer" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.701045 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} err="failed to get container status \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": rpc error: code = NotFound desc = could not find container \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": container with ID starting with 13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.701075 4882 scope.go:117] "RemoveContainer" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.701750 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} err="failed to get container status \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": rpc error: code = NotFound desc = could not find container \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": container with ID starting with d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.701775 4882 scope.go:117] "RemoveContainer" containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.701980 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} err="failed to get container status \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": rpc error: code = NotFound desc = could not find container \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": container with ID starting with 3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d not found: ID does not exist" Oct 
02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702010 4882 scope.go:117] "RemoveContainer" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702248 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} err="failed to get container status \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": rpc error: code = NotFound desc = could not find container \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": container with ID starting with fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702274 4882 scope.go:117] "RemoveContainer" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702508 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} err="failed to get container status \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": rpc error: code = NotFound desc = could not find container \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": container with ID starting with b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702533 4882 scope.go:117] "RemoveContainer" containerID="e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702933 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2"} err="failed to get container status \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": rpc error: code = NotFound desc = could not find container \"e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2\": container with ID starting with e0bac649aea31d155ff376d6716be607588a492f1d1e3e70675752640be8c8d2 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.702961 4882 scope.go:117] "RemoveContainer" containerID="8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703266 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d"} err="failed to get container status \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": rpc error: code = NotFound desc = could not find container \"8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d\": container with ID starting with 8b5fa03006355491fcc756e1230d1df8b612b951dcb0a6cfc63a27790126d75d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703289 4882 scope.go:117] "RemoveContainer" containerID="aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703512 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c"} err="failed to get container status 
\"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": rpc error: code = NotFound desc = could not find container \"aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c\": container with ID starting with aeb926bcf3ed9f09431d74fdb358a747fb3bb67698de3b7a5869e40f96a6c91c not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703538 4882 scope.go:117] "RemoveContainer" containerID="b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703861 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3"} err="failed to get container status \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": rpc error: code = NotFound desc = could not find container \"b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3\": container with ID starting with b84e9760d7a5547197a8b66b434be150817f7800d8bb852f77b5051f327475e3 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.703888 4882 scope.go:117] "RemoveContainer" containerID="7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704179 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70"} err="failed to get container status \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": rpc error: code = NotFound desc = could not find container \"7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70\": container with ID starting with 7d68d85afb64a853d7d470007ceae252fb8af90dad119a114b8ed783edc52f70 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704227 4882 scope.go:117] "RemoveContainer" containerID="13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704524 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8"} err="failed to get container status \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": rpc error: code = NotFound desc = could not find container \"13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8\": container with ID starting with 13bd4d9558dc97b5134d346aa479d804dfafd2d923050902d4915edb49a23ee8 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704548 4882 scope.go:117] "RemoveContainer" containerID="d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704866 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be"} err="failed to get container status \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": rpc error: code = NotFound desc = could not find container \"d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be\": container with ID starting with d8efb4b292af0897c5163c026f52dcf7495904213a82644f20c3bebe6c39a6be not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.704912 4882 scope.go:117] "RemoveContainer" 
containerID="3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.705149 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d"} err="failed to get container status \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": rpc error: code = NotFound desc = could not find container \"3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d\": container with ID starting with 3220e7a3926892b0ad9663764e3adb2fdd81fb3398a7ed9f31e1b76aee28331d not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.705173 4882 scope.go:117] "RemoveContainer" containerID="fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.705496 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6"} err="failed to get container status \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": rpc error: code = NotFound desc = could not find container \"fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6\": container with ID starting with fc3b2705f4242f62ea59aab93a8dba4816c61698e5c05f4489d2962b37b1b3d6 not found: ID does not exist" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.705523 4882 scope.go:117] "RemoveContainer" containerID="b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00" Oct 02 16:29:21 crc kubenswrapper[4882]: I1002 16:29:21.705818 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00"} err="failed to get container status \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": rpc error: code = NotFound desc = could not find container \"b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00\": container with ID starting with b6ca33ca8defe8e2aed42f7deba02069727eaf63d0536fbcb25ab83ad6d5cd00 not found: ID does not exist" Oct 02 16:29:22 crc kubenswrapper[4882]: I1002 16:29:22.372512 4882 generic.go:334] "Generic (PLEG): container finished" podID="be76cf50-83d6-4728-9fd0-ef42e458b022" containerID="0749c448480d579b0d89d724d1454e6c2f078666ac083debb9978fd2cc805972" exitCode=0 Oct 02 16:29:22 crc kubenswrapper[4882]: I1002 16:29:22.372690 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerDied","Data":"0749c448480d579b0d89d724d1454e6c2f078666ac083debb9978fd2cc805972"} Oct 02 16:29:22 crc kubenswrapper[4882]: I1002 16:29:22.372998 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"7bf338c8d7ea497e7cb555c299776707002a896f32723ace47a7bea2f794b8b4"} Oct 02 16:29:22 crc kubenswrapper[4882]: I1002 16:29:22.375711 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/2.log" Oct 02 16:29:22 crc kubenswrapper[4882]: I1002 16:29:22.778225 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7911af1a-fc82-463b-b72d-9c55e5073e45" 
path="/var/lib/kubelet/pods/7911af1a-fc82-463b-b72d-9c55e5073e45/volumes" Oct 02 16:29:23 crc kubenswrapper[4882]: I1002 16:29:23.387563 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"0b50efa2ec42ab1d04b9abc952b0f1dae6785ef3033aeef47822f76fb1786f33"} Oct 02 16:29:23 crc kubenswrapper[4882]: I1002 16:29:23.387875 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"df89a4bb2fc4d70a8453ee8d8487667cb665f85e8e155ddad8db077bad457152"} Oct 02 16:29:23 crc kubenswrapper[4882]: I1002 16:29:23.387886 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"85449bdcba206907e3f7d2aa5c5081b6456f031aab4e549828613c1ebec3dced"} Oct 02 16:29:24 crc kubenswrapper[4882]: I1002 16:29:24.397658 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"72ac06d5822a675f24f275efb627ac998f1d28478989c29e7477c3d35342ee4d"} Oct 02 16:29:24 crc kubenswrapper[4882]: I1002 16:29:24.398053 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"e365b5e15020556041ef9386a870b851ead23344ede4cbab1855710940f67b81"} Oct 02 16:29:24 crc kubenswrapper[4882]: I1002 16:29:24.398073 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"57f7da2f0ee26062490b20aafc7e7e816db5db107fc3560bf98ee3f1275f0dec"} Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.417375 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"d170a8dc8613c705943790f33053d3ff0378592f2c020f27654729ba8a04e51d"} Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.972643 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7kz2r"] Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.973297 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.974923 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.975630 4882 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-k8n9z" Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.976375 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 16:29:27 crc kubenswrapper[4882]: I1002 16:29:27.976377 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.077667 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6gk\" (UniqueName: \"kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.077848 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.077892 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.179319 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.179452 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.179498 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6gk\" (UniqueName: \"kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.179680 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.180247 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.203384 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6gk\" (UniqueName: \"kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk\") pod \"crc-storage-crc-7kz2r\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: I1002 16:29:28.294838 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: E1002 16:29:28.330092 4882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(b4cb35f7bee2e5db06b3dc5b64c3a77279d2e41bf726a3cc8a0769b5c1984ae4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 16:29:28 crc kubenswrapper[4882]: E1002 16:29:28.330157 4882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(b4cb35f7bee2e5db06b3dc5b64c3a77279d2e41bf726a3cc8a0769b5c1984ae4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: E1002 16:29:28.330179 4882 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(b4cb35f7bee2e5db06b3dc5b64c3a77279d2e41bf726a3cc8a0769b5c1984ae4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:28 crc kubenswrapper[4882]: E1002 16:29:28.330251 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7kz2r_crc-storage(9a3aa279-1314-4a0b-8418-8556d441d03d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7kz2r_crc-storage(9a3aa279-1314-4a0b-8418-8556d441d03d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(b4cb35f7bee2e5db06b3dc5b64c3a77279d2e41bf726a3cc8a0769b5c1984ae4): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7kz2r" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.431783 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" event={"ID":"be76cf50-83d6-4728-9fd0-ef42e458b022","Type":"ContainerStarted","Data":"e01c41d47508e0b8c0867d59c9cad3796b105bfd625beabe9589303c452bc01b"} Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.432251 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.432281 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.432293 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.455538 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.471175 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" podStartSLOduration=8.471148467 podStartE2EDuration="8.471148467s" podCreationTimestamp="2025-10-02 16:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:29:29.457937612 +0000 UTC m=+728.207167169" watchObservedRunningTime="2025-10-02 16:29:29.471148467 +0000 UTC m=+728.220378014" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.480403 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.508124 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7kz2r"] Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.508330 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:29 crc kubenswrapper[4882]: I1002 16:29:29.508957 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:29 crc kubenswrapper[4882]: E1002 16:29:29.533403 4882 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(e42d6c160fc365870fe49f3bda471b1f2f3ab0cd028d5229412e2fc55d990cd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 16:29:29 crc kubenswrapper[4882]: E1002 16:29:29.533483 4882 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(e42d6c160fc365870fe49f3bda471b1f2f3ab0cd028d5229412e2fc55d990cd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:29 crc kubenswrapper[4882]: E1002 16:29:29.533510 4882 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(e42d6c160fc365870fe49f3bda471b1f2f3ab0cd028d5229412e2fc55d990cd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:29 crc kubenswrapper[4882]: E1002 16:29:29.533562 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7kz2r_crc-storage(9a3aa279-1314-4a0b-8418-8556d441d03d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7kz2r_crc-storage(9a3aa279-1314-4a0b-8418-8556d441d03d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7kz2r_crc-storage_9a3aa279-1314-4a0b-8418-8556d441d03d_0(e42d6c160fc365870fe49f3bda471b1f2f3ab0cd028d5229412e2fc55d990cd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7kz2r" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" Oct 02 16:29:35 crc kubenswrapper[4882]: I1002 16:29:35.760644 4882 scope.go:117] "RemoveContainer" containerID="7aa35c2662c0805faaf732c1db3dfb6ccb2d6c73ddd56a45203e524734283fef" Oct 02 16:29:37 crc kubenswrapper[4882]: I1002 16:29:37.476532 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ppcpg_565a5a5f-e220-4ce6-86a7-f94f9dbe48c2/kube-multus/2.log" Oct 02 16:29:37 crc kubenswrapper[4882]: I1002 16:29:37.476890 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ppcpg" event={"ID":"565a5a5f-e220-4ce6-86a7-f94f9dbe48c2","Type":"ContainerStarted","Data":"6b046a82c63d996eb9cf49de2f2fa4b7467450f5c3cfe00b136f0fe3b36e578a"} Oct 02 16:29:41 crc kubenswrapper[4882]: I1002 16:29:41.759587 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:41 crc kubenswrapper[4882]: I1002 16:29:41.760688 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:29:41 crc kubenswrapper[4882]: I1002 16:29:41.989456 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7kz2r"] Oct 02 16:29:41 crc kubenswrapper[4882]: W1002 16:29:41.999411 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3aa279_1314_4a0b_8418_8556d441d03d.slice/crio-edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe WatchSource:0}: Error finding container edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe: Status 404 returned error can't find the container with id edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe Oct 02 16:29:42 crc kubenswrapper[4882]: I1002 16:29:42.001141 4882 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 16:29:42 crc kubenswrapper[4882]: I1002 16:29:42.507141 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kz2r" event={"ID":"9a3aa279-1314-4a0b-8418-8556d441d03d","Type":"ContainerStarted","Data":"edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe"} Oct 02 16:29:51 crc kubenswrapper[4882]: I1002 16:29:51.530674 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j48m7" Oct 02 16:29:54 crc kubenswrapper[4882]: I1002 16:29:54.547635 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"] Oct 02 16:29:54 crc kubenswrapper[4882]: I1002 16:29:54.547866 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" containerID="cri-o://818b28340b1678d2be5b3cb17613fee6374ca8a6dd5566988b18415a113cbbd8" gracePeriod=30 Oct 02 16:29:54 crc kubenswrapper[4882]: I1002 16:29:54.649569 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:29:54 crc kubenswrapper[4882]: I1002 16:29:54.650312 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" containerName="route-controller-manager" containerID="cri-o://7b07e2c779a6fbf4f475ce9cf1c343e6032cfd7a83cdca28a3cd301b468c1782" gracePeriod=30 Oct 02 16:29:57 crc kubenswrapper[4882]: I1002 16:29:57.594170 4882 generic.go:334] "Generic (PLEG): container finished" podID="a182d0c0-45a9-450b-affc-44caf339abd8" containerID="7b07e2c779a6fbf4f475ce9cf1c343e6032cfd7a83cdca28a3cd301b468c1782" exitCode=0 Oct 02 16:29:57 crc kubenswrapper[4882]: I1002 16:29:57.594346 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" event={"ID":"a182d0c0-45a9-450b-affc-44caf339abd8","Type":"ContainerDied","Data":"7b07e2c779a6fbf4f475ce9cf1c343e6032cfd7a83cdca28a3cd301b468c1782"} Oct 02 16:29:57 crc kubenswrapper[4882]: I1002 16:29:57.597999 4882 generic.go:334] "Generic (PLEG): container finished" podID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerID="818b28340b1678d2be5b3cb17613fee6374ca8a6dd5566988b18415a113cbbd8" exitCode=0 Oct 02 16:29:57 crc kubenswrapper[4882]: I1002 16:29:57.598051 4882 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" event={"ID":"6cfd762f-4be9-49a4-9851-f3211e11e6ad","Type":"ContainerDied","Data":"818b28340b1678d2be5b3cb17613fee6374ca8a6dd5566988b18415a113cbbd8"} Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.587378 4882 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.821839 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.850104 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h"] Oct 02 16:29:58 crc kubenswrapper[4882]: E1002 16:29:58.850414 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.850431 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.850550 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" containerName="controller-manager" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.850900 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:58 crc kubenswrapper[4882]: I1002 16:29:58.885676 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h"] Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.017922 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca\") pod \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018194 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert\") pod \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018286 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgwv6\" (UniqueName: \"kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6\") pod \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018306 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles\") pod \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018378 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config\") pod 
\"6cfd762f-4be9-49a4-9851-f3211e11e6ad\" (UID: \"6cfd762f-4be9-49a4-9851-f3211e11e6ad\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018539 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-config\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018589 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-client-ca\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018609 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9b718b-4236-4229-a787-af7839578f14-serving-cert\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018641 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2lwz\" (UniqueName: \"kubernetes.io/projected/0f9b718b-4236-4229-a787-af7839578f14-kube-api-access-f2lwz\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.018680 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-proxy-ca-bundles\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.020393 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6cfd762f-4be9-49a4-9851-f3211e11e6ad" (UID: "6cfd762f-4be9-49a4-9851-f3211e11e6ad"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.020415 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config" (OuterVolumeSpecName: "config") pod "6cfd762f-4be9-49a4-9851-f3211e11e6ad" (UID: "6cfd762f-4be9-49a4-9851-f3211e11e6ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.022005 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cfd762f-4be9-49a4-9851-f3211e11e6ad" (UID: "6cfd762f-4be9-49a4-9851-f3211e11e6ad"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.025823 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cfd762f-4be9-49a4-9851-f3211e11e6ad" (UID: "6cfd762f-4be9-49a4-9851-f3211e11e6ad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.027425 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6" (OuterVolumeSpecName: "kube-api-access-pgwv6") pod "6cfd762f-4be9-49a4-9851-f3211e11e6ad" (UID: "6cfd762f-4be9-49a4-9851-f3211e11e6ad"). InnerVolumeSpecName "kube-api-access-pgwv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.084178 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.122958 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-config\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123060 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-client-ca\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123084 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9b718b-4236-4229-a787-af7839578f14-serving-cert\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123107 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2lwz\" (UniqueName: \"kubernetes.io/projected/0f9b718b-4236-4229-a787-af7839578f14-kube-api-access-f2lwz\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123159 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-proxy-ca-bundles\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123239 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc 
kubenswrapper[4882]: I1002 16:29:59.123260 4882 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123273 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfd762f-4be9-49a4-9851-f3211e11e6ad-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123283 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgwv6\" (UniqueName: \"kubernetes.io/projected/6cfd762f-4be9-49a4-9851-f3211e11e6ad-kube-api-access-pgwv6\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.123293 4882 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cfd762f-4be9-49a4-9851-f3211e11e6ad-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.124432 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-proxy-ca-bundles\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.124868 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-config\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.126447 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f9b718b-4236-4229-a787-af7839578f14-client-ca\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.127332 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f9b718b-4236-4229-a787-af7839578f14-serving-cert\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.143001 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2lwz\" (UniqueName: \"kubernetes.io/projected/0f9b718b-4236-4229-a787-af7839578f14-kube-api-access-f2lwz\") pod \"controller-manager-5b4b6654fd-ddx5h\" (UID: \"0f9b718b-4236-4229-a787-af7839578f14\") " pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.170782 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:29:59 crc kubenswrapper[4882]: E1002 16:29:59.217371 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/bash:latest" Oct 02 16:29:59 crc kubenswrapper[4882]: E1002 16:29:59.217525 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:storage,Image:quay.io/openstack-k8s-operators/bash:latest,Command:[bash],Args:[/usr/local/bin/crc-storage.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:PV_NUM,Value:12,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:crc-storage,ReadOnly:true,MountPath:/usr/local/bin/crc-storage.sh,SubPath:create-storage.sh,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:node-mnt,ReadOnly:false,MountPath:/mnt/nodeMnt,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mt6gk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-storage-crc-7kz2r_crc-storage(9a3aa279-1314-4a0b-8418-8556d441d03d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:29:59 crc kubenswrapper[4882]: E1002 16:29:59.218701 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="crc-storage/crc-storage-crc-7kz2r" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.225150 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config\") pod \"a182d0c0-45a9-450b-affc-44caf339abd8\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.225201 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74p89\" (UniqueName: \"kubernetes.io/projected/a182d0c0-45a9-450b-affc-44caf339abd8-kube-api-access-74p89\") pod \"a182d0c0-45a9-450b-affc-44caf339abd8\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.225256 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca\") pod \"a182d0c0-45a9-450b-affc-44caf339abd8\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " Oct 02 16:29:59 crc 
kubenswrapper[4882]: I1002 16:29:59.225277 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert\") pod \"a182d0c0-45a9-450b-affc-44caf339abd8\" (UID: \"a182d0c0-45a9-450b-affc-44caf339abd8\") " Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.226714 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca" (OuterVolumeSpecName: "client-ca") pod "a182d0c0-45a9-450b-affc-44caf339abd8" (UID: "a182d0c0-45a9-450b-affc-44caf339abd8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.226848 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config" (OuterVolumeSpecName: "config") pod "a182d0c0-45a9-450b-affc-44caf339abd8" (UID: "a182d0c0-45a9-450b-affc-44caf339abd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.231767 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a182d0c0-45a9-450b-affc-44caf339abd8" (UID: "a182d0c0-45a9-450b-affc-44caf339abd8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
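
The pull failure a few entries back is the start of a classic back-off cycle: the PullImage RPC for quay.io/openstack-k8s-operators/bash:latest was cancelled mid-transfer ("copying config: context canceled"), the sync attempt for crc-storage-crc-7kz2r failed with ErrImagePull, and at 16:29:59.618982 the same pod was reported with ImagePullBackOff. Rather than retrying immediately, the kubelet delays each retry exponentially. A minimal sketch of that schedule, assuming the commonly cited defaults of a 10s base doubling to a 300s cap (assumptions here, not values read from this log):

```go
package main

import (
	"fmt"
	"time"
)

// Sketch of the exponential back-off the kubelet applies between image
// pull retries (ErrImagePull -> ImagePullBackOff -> retry). The 10s base
// and 300s cap are assumed defaults, not values read from this log.
func backoffSchedule(base, limit time.Duration, attempts int) []time.Duration {
	var waits []time.Duration
	next := base
	for i := 0; i < attempts; i++ {
		waits = append(waits, next)
		next *= 2
		if next > limit {
			next = limit
		}
	}
	return waits
}

func main() {
	// Prints: 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
	for i, wait := range backoffSchedule(10*time.Second, 5*time.Minute, 7) {
		fmt.Printf("retry %d delayed by %v\n", i+1, wait)
	}
}
```

The cycle resolves on its own here: at 16:30:17 the storage container finishes with exitCode=0, so a later pull attempt evidently succeeded.
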
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.326684 4882 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.326735 4882 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a182d0c0-45a9-450b-affc-44caf339abd8-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.326753 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a182d0c0-45a9-450b-affc-44caf339abd8-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.326771 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74p89\" (UniqueName: \"kubernetes.io/projected/a182d0c0-45a9-450b-affc-44caf339abd8-kube-api-access-74p89\") on node \"crc\" DevicePath \"\"" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.379940 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h"] Oct 02 16:29:59 crc kubenswrapper[4882]: W1002 16:29:59.391110 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f9b718b_4236_4229_a787_af7839578f14.slice/crio-ab3bd41ba536776dfb2cdbe1634f54b03477c4e9c62c7cc92358c22b61853c33 WatchSource:0}: Error finding container ab3bd41ba536776dfb2cdbe1634f54b03477c4e9c62c7cc92358c22b61853c33: Status 404 returned error can't find the container with id ab3bd41ba536776dfb2cdbe1634f54b03477c4e9c62c7cc92358c22b61853c33 Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.612919 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.612919 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dnsbk" event={"ID":"6cfd762f-4be9-49a4-9851-f3211e11e6ad","Type":"ContainerDied","Data":"9f6fcc34e437806b6270f76476ffe445b9d2b9da9a75ae59e4b771d1714ab276"} Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.613498 4882 scope.go:117] "RemoveContainer" containerID="818b28340b1678d2be5b3cb17613fee6374ca8a6dd5566988b18415a113cbbd8" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.615384 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.615381 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth" event={"ID":"a182d0c0-45a9-450b-affc-44caf339abd8","Type":"ContainerDied","Data":"b66e511f441625a6fb0032834d0384a54a6f0bdecb4c1f639ba11c2c704d7022"} Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.617350 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" event={"ID":"0f9b718b-4236-4229-a787-af7839578f14","Type":"ContainerStarted","Data":"ab3bd41ba536776dfb2cdbe1634f54b03477c4e9c62c7cc92358c22b61853c33"} Oct 02 16:29:59 crc kubenswrapper[4882]: E1002 16:29:59.618982 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/bash:latest\\\"\"" pod="crc-storage/crc-storage-crc-7kz2r" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.630198 4882 scope.go:117] "RemoveContainer" containerID="7b07e2c779a6fbf4f475ce9cf1c343e6032cfd7a83cdca28a3cd301b468c1782" Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.664322 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.668409 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mfzth"] Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.677435 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"] Oct 02 16:29:59 crc kubenswrapper[4882]: I1002 16:29:59.681172 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dnsbk"] Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.137998 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n"] Oct 02 16:30:00 crc kubenswrapper[4882]: E1002 16:30:00.138289 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" containerName="route-controller-manager" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.138303 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" containerName="route-controller-manager" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.138434 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" containerName="route-controller-manager" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.138791 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.140793 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.143428 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.160640 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n"] Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.341725 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.341797 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpsr\" (UniqueName: \"kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.342080 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.443012 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.443523 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.443568 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpsr\" (UniqueName: \"kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.444781 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume\") pod 
\"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.451498 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.466903 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpsr\" (UniqueName: \"kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr\") pod \"collect-profiles-29323710-kz74n\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.625962 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" event={"ID":"0f9b718b-4236-4229-a787-af7839578f14","Type":"ContainerStarted","Data":"f6c2f0152b372baf1924f6ebf6ee882013ca276b826b4579f53e2ae5a727c916"} Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.626412 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.632396 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.671285 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b4b6654fd-ddx5h" podStartSLOduration=5.671005747 podStartE2EDuration="5.671005747s" podCreationTimestamp="2025-10-02 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:30:00.665282805 +0000 UTC m=+759.414512342" watchObservedRunningTime="2025-10-02 16:30:00.671005747 +0000 UTC m=+759.420235284" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.759503 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.768882 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfd762f-4be9-49a4-9851-f3211e11e6ad" path="/var/lib/kubelet/pods/6cfd762f-4be9-49a4-9851-f3211e11e6ad/volumes" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.769596 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a182d0c0-45a9-450b-affc-44caf339abd8" path="/var/lib/kubelet/pods/a182d0c0-45a9-450b-affc-44caf339abd8/volumes" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.958736 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m"] Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.959899 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.963083 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.963574 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.963751 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.963940 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.965945 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.971094 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m"] Oct 02 16:30:00 crc kubenswrapper[4882]: I1002 16:30:00.974264 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.152381 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djk8r\" (UniqueName: \"kubernetes.io/projected/5c422abe-ebae-43d8-9d25-8a0e42336391-kube-api-access-djk8r\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.152477 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-client-ca\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.152575 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c422abe-ebae-43d8-9d25-8a0e42336391-serving-cert\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.152779 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-config\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.179092 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n"] Oct 02 16:30:01 crc kubenswrapper[4882]: W1002 16:30:01.188608 4882 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0c18c3_5fac_411e_979c_f65c58d99554.slice/crio-3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93 WatchSource:0}: Error finding container 3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93: Status 404 returned error can't find the container with id 3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93 Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.254672 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-client-ca\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.254833 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c422abe-ebae-43d8-9d25-8a0e42336391-serving-cert\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.254906 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-config\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.254956 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djk8r\" (UniqueName: \"kubernetes.io/projected/5c422abe-ebae-43d8-9d25-8a0e42336391-kube-api-access-djk8r\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.256992 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-config\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.257276 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c422abe-ebae-43d8-9d25-8a0e42336391-client-ca\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.260773 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c422abe-ebae-43d8-9d25-8a0e42336391-serving-cert\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.272078 4882 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-djk8r\" (UniqueName: \"kubernetes.io/projected/5c422abe-ebae-43d8-9d25-8a0e42336391-kube-api-access-djk8r\") pod \"route-controller-manager-6d8887674-cwk5m\" (UID: \"5c422abe-ebae-43d8-9d25-8a0e42336391\") " pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.293036 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.515079 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m"] Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.635838 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" event={"ID":"ce0c18c3-5fac-411e-979c-f65c58d99554","Type":"ContainerStarted","Data":"d5cb8140333f64514ea9135c8c380c81f610c8a9d004e4b9751d8bfe0bb478dd"} Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.635902 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" event={"ID":"ce0c18c3-5fac-411e-979c-f65c58d99554","Type":"ContainerStarted","Data":"3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93"} Oct 02 16:30:01 crc kubenswrapper[4882]: I1002 16:30:01.637838 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" event={"ID":"5c422abe-ebae-43d8-9d25-8a0e42336391","Type":"ContainerStarted","Data":"34884b12241060149f220cbda7b4840a64849fb678a8dc492e81e17f76e0901f"} Oct 02 16:30:01 crc kubenswrapper[4882]: E1002 16:30:01.895725 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0c18c3_5fac_411e_979c_f65c58d99554.slice/crio-conmon-d5cb8140333f64514ea9135c8c380c81f610c8a9d004e4b9751d8bfe0bb478dd.scope\": RecentStats: unable to find data in memory cache]" Oct 02 16:30:02 crc kubenswrapper[4882]: I1002 16:30:02.645459 4882 generic.go:334] "Generic (PLEG): container finished" podID="ce0c18c3-5fac-411e-979c-f65c58d99554" containerID="d5cb8140333f64514ea9135c8c380c81f610c8a9d004e4b9751d8bfe0bb478dd" exitCode=0 Oct 02 16:30:02 crc kubenswrapper[4882]: I1002 16:30:02.645535 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" event={"ID":"ce0c18c3-5fac-411e-979c-f65c58d99554","Type":"ContainerDied","Data":"d5cb8140333f64514ea9135c8c380c81f610c8a9d004e4b9751d8bfe0bb478dd"} Oct 02 16:30:02 crc kubenswrapper[4882]: I1002 16:30:02.648258 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" event={"ID":"5c422abe-ebae-43d8-9d25-8a0e42336391","Type":"ContainerStarted","Data":"2be36a0d5bd59ad1423476068ad51809cda1db73b5eb6a95b02220000c447409"} Oct 02 16:30:02 crc kubenswrapper[4882]: I1002 16:30:02.648732 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:02 crc kubenswrapper[4882]: I1002 16:30:02.655141 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.895692 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.911113 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8887674-cwk5m" podStartSLOduration=8.911082373 podStartE2EDuration="8.911082373s" podCreationTimestamp="2025-10-02 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:30:02.691141166 +0000 UTC m=+761.440370743" watchObservedRunningTime="2025-10-02 16:30:03.911082373 +0000 UTC m=+762.660311900" Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.993623 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume\") pod \"ce0c18c3-5fac-411e-979c-f65c58d99554\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.993681 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume\") pod \"ce0c18c3-5fac-411e-979c-f65c58d99554\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.993788 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpsr\" (UniqueName: \"kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr\") pod \"ce0c18c3-5fac-411e-979c-f65c58d99554\" (UID: \"ce0c18c3-5fac-411e-979c-f65c58d99554\") " Oct 02 16:30:03 crc kubenswrapper[4882]: I1002 16:30:03.995386 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce0c18c3-5fac-411e-979c-f65c58d99554" (UID: "ce0c18c3-5fac-411e-979c-f65c58d99554"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.000929 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr" (OuterVolumeSpecName: "kube-api-access-4wpsr") pod "ce0c18c3-5fac-411e-979c-f65c58d99554" (UID: "ce0c18c3-5fac-411e-979c-f65c58d99554"). InnerVolumeSpecName "kube-api-access-4wpsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.002013 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce0c18c3-5fac-411e-979c-f65c58d99554" (UID: "ce0c18c3-5fac-411e-979c-f65c58d99554"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.095889 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpsr\" (UniqueName: \"kubernetes.io/projected/ce0c18c3-5fac-411e-979c-f65c58d99554-kube-api-access-4wpsr\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.095925 4882 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce0c18c3-5fac-411e-979c-f65c58d99554-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.095938 4882 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce0c18c3-5fac-411e-979c-f65c58d99554-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.662246 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.662274 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323710-kz74n" event={"ID":"ce0c18c3-5fac-411e-979c-f65c58d99554","Type":"ContainerDied","Data":"3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93"} Oct 02 16:30:04 crc kubenswrapper[4882]: I1002 16:30:04.662324 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e77242441cc86252304e41eab3823c21138aac9299bb8a56d4c3186182f2f93" Oct 02 16:30:09 crc kubenswrapper[4882]: I1002 16:30:09.390719 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:30:09 crc kubenswrapper[4882]: I1002 16:30:09.391104 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:30:17 crc kubenswrapper[4882]: I1002 16:30:17.748703 4882 generic.go:334] "Generic (PLEG): container finished" podID="9a3aa279-1314-4a0b-8418-8556d441d03d" containerID="21e25f32f58edb8c21fc6af5cae89c037514125b951e03c60fbd0e2e9e0baba7" exitCode=0 Oct 02 16:30:17 crc kubenswrapper[4882]: I1002 16:30:17.748814 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kz2r" event={"ID":"9a3aa279-1314-4a0b-8418-8556d441d03d","Type":"ContainerDied","Data":"21e25f32f58edb8c21fc6af5cae89c037514125b951e03c60fbd0e2e9e0baba7"} Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.083453 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.105717 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt6gk\" (UniqueName: \"kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk\") pod \"9a3aa279-1314-4a0b-8418-8556d441d03d\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.105804 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage\") pod \"9a3aa279-1314-4a0b-8418-8556d441d03d\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.105927 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt\") pod \"9a3aa279-1314-4a0b-8418-8556d441d03d\" (UID: \"9a3aa279-1314-4a0b-8418-8556d441d03d\") " Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.106194 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9a3aa279-1314-4a0b-8418-8556d441d03d" (UID: "9a3aa279-1314-4a0b-8418-8556d441d03d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.113565 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk" (OuterVolumeSpecName: "kube-api-access-mt6gk") pod "9a3aa279-1314-4a0b-8418-8556d441d03d" (UID: "9a3aa279-1314-4a0b-8418-8556d441d03d"). InnerVolumeSpecName "kube-api-access-mt6gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.124116 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9a3aa279-1314-4a0b-8418-8556d441d03d" (UID: "9a3aa279-1314-4a0b-8418-8556d441d03d"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.207549 4882 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9a3aa279-1314-4a0b-8418-8556d441d03d-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.207813 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt6gk\" (UniqueName: \"kubernetes.io/projected/9a3aa279-1314-4a0b-8418-8556d441d03d-kube-api-access-mt6gk\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.207872 4882 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9a3aa279-1314-4a0b-8418-8556d441d03d-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.760425 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7kz2r" event={"ID":"9a3aa279-1314-4a0b-8418-8556d441d03d","Type":"ContainerDied","Data":"edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe"} Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.760474 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edcb3b7943bd4fe633dffbab17ce8100e62086e6f76331e0b300484a431a3ebe" Oct 02 16:30:19 crc kubenswrapper[4882]: I1002 16:30:19.760531 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7kz2r" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.347153 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488"] Oct 02 16:30:27 crc kubenswrapper[4882]: E1002 16:30:27.348031 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0c18c3-5fac-411e-979c-f65c58d99554" containerName="collect-profiles" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.348051 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0c18c3-5fac-411e-979c-f65c58d99554" containerName="collect-profiles" Oct 02 16:30:27 crc kubenswrapper[4882]: E1002 16:30:27.348080 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" containerName="storage" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.348088 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" containerName="storage" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.348202 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0c18c3-5fac-411e-979c-f65c58d99554" containerName="collect-profiles" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.348245 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3aa279-1314-4a0b-8418-8556d441d03d" containerName="storage" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.349140 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.351252 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.359802 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488"] Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.411064 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.411128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.411170 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p67bd\" (UniqueName: \"kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.512555 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.512648 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.512704 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p67bd\" (UniqueName: \"kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.513086 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.513494 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.531860 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p67bd\" (UniqueName: \"kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:27 crc kubenswrapper[4882]: I1002 16:30:27.666849 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:28 crc kubenswrapper[4882]: I1002 16:30:28.065674 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488"] Oct 02 16:30:28 crc kubenswrapper[4882]: I1002 16:30:28.813130 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerStarted","Data":"ca3533c5a6b1761d6933ba555d958ec1abf3b3fefe166605997248dfa2e36a61"} Oct 02 16:30:28 crc kubenswrapper[4882]: I1002 16:30:28.813180 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerStarted","Data":"7d702ada694071870fe54546a49027e2624deabff39377b6cfbd70e6111a79b2"} Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.699270 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.700621 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.713826 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.761299 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.761348 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbf82\" (UniqueName: \"kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.761378 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.819630 4882 generic.go:334] "Generic (PLEG): container finished" podID="bf113633-1b94-4991-89f2-7418c97f620f" containerID="ca3533c5a6b1761d6933ba555d958ec1abf3b3fefe166605997248dfa2e36a61" exitCode=0 Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.819690 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerDied","Data":"ca3533c5a6b1761d6933ba555d958ec1abf3b3fefe166605997248dfa2e36a61"} Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.862823 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.862902 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbf82\" (UniqueName: \"kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.862952 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.863883 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content\") pod \"redhat-operators-4qkg4\" (UID: 
\"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.863945 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:29 crc kubenswrapper[4882]: I1002 16:30:29.884881 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbf82\" (UniqueName: \"kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82\") pod \"redhat-operators-4qkg4\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:30 crc kubenswrapper[4882]: I1002 16:30:30.071966 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:30 crc kubenswrapper[4882]: I1002 16:30:30.489843 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:30 crc kubenswrapper[4882]: W1002 16:30:30.500864 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d3fbc5_23ba_44e7_aba3_089b68e6d2c0.slice/crio-1a90dbebbf8b4218a7a22c7ad99d54104060a552c51cc2105bc3571295ce2113 WatchSource:0}: Error finding container 1a90dbebbf8b4218a7a22c7ad99d54104060a552c51cc2105bc3571295ce2113: Status 404 returned error can't find the container with id 1a90dbebbf8b4218a7a22c7ad99d54104060a552c51cc2105bc3571295ce2113 Oct 02 16:30:30 crc kubenswrapper[4882]: I1002 16:30:30.826023 4882 generic.go:334] "Generic (PLEG): container finished" podID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerID="bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6" exitCode=0 Oct 02 16:30:30 crc kubenswrapper[4882]: I1002 16:30:30.826071 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerDied","Data":"bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6"} Oct 02 16:30:30 crc kubenswrapper[4882]: I1002 16:30:30.826100 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerStarted","Data":"1a90dbebbf8b4218a7a22c7ad99d54104060a552c51cc2105bc3571295ce2113"} Oct 02 16:30:31 crc kubenswrapper[4882]: I1002 16:30:31.835117 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerStarted","Data":"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa"} Oct 02 16:30:32 crc kubenswrapper[4882]: E1002 16:30:32.270295 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d3fbc5_23ba_44e7_aba3_089b68e6d2c0.slice/crio-b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa.scope\": RecentStats: unable to find data in memory cache]" Oct 02 16:30:32 crc kubenswrapper[4882]: I1002 16:30:32.842859 4882 generic.go:334] "Generic (PLEG): container finished" podID="bf113633-1b94-4991-89f2-7418c97f620f" 
containerID="8b019520610b093b5509585b27a8788c7c26785ac37fd96b4485f8eda08b17e7" exitCode=0 Oct 02 16:30:32 crc kubenswrapper[4882]: I1002 16:30:32.842961 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerDied","Data":"8b019520610b093b5509585b27a8788c7c26785ac37fd96b4485f8eda08b17e7"} Oct 02 16:30:32 crc kubenswrapper[4882]: I1002 16:30:32.846909 4882 generic.go:334] "Generic (PLEG): container finished" podID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerID="b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa" exitCode=0 Oct 02 16:30:32 crc kubenswrapper[4882]: I1002 16:30:32.847301 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerDied","Data":"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa"} Oct 02 16:30:33 crc kubenswrapper[4882]: I1002 16:30:33.858755 4882 generic.go:334] "Generic (PLEG): container finished" podID="bf113633-1b94-4991-89f2-7418c97f620f" containerID="fa80b550d7e607fbd5a046a9033303a1ca3643f5b8e56d43b0d4aaa3f7559839" exitCode=0 Oct 02 16:30:33 crc kubenswrapper[4882]: I1002 16:30:33.858899 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerDied","Data":"fa80b550d7e607fbd5a046a9033303a1ca3643f5b8e56d43b0d4aaa3f7559839"} Oct 02 16:30:33 crc kubenswrapper[4882]: I1002 16:30:33.864284 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerStarted","Data":"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57"} Oct 02 16:30:33 crc kubenswrapper[4882]: I1002 16:30:33.898284 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qkg4" podStartSLOduration=2.396228446 podStartE2EDuration="4.898262553s" podCreationTimestamp="2025-10-02 16:30:29 +0000 UTC" firstStartedPulling="2025-10-02 16:30:30.827700139 +0000 UTC m=+789.576929676" lastFinishedPulling="2025-10-02 16:30:33.329734256 +0000 UTC m=+792.078963783" observedRunningTime="2025-10-02 16:30:33.895994652 +0000 UTC m=+792.645224189" watchObservedRunningTime="2025-10-02 16:30:33.898262553 +0000 UTC m=+792.647492090" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.162519 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.328466 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p67bd\" (UniqueName: \"kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd\") pod \"bf113633-1b94-4991-89f2-7418c97f620f\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.328612 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle\") pod \"bf113633-1b94-4991-89f2-7418c97f620f\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.328766 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util\") pod \"bf113633-1b94-4991-89f2-7418c97f620f\" (UID: \"bf113633-1b94-4991-89f2-7418c97f620f\") " Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.329343 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle" (OuterVolumeSpecName: "bundle") pod "bf113633-1b94-4991-89f2-7418c97f620f" (UID: "bf113633-1b94-4991-89f2-7418c97f620f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.337364 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd" (OuterVolumeSpecName: "kube-api-access-p67bd") pod "bf113633-1b94-4991-89f2-7418c97f620f" (UID: "bf113633-1b94-4991-89f2-7418c97f620f"). InnerVolumeSpecName "kube-api-access-p67bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.338985 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util" (OuterVolumeSpecName: "util") pod "bf113633-1b94-4991-89f2-7418c97f620f" (UID: "bf113633-1b94-4991-89f2-7418c97f620f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.430641 4882 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.430691 4882 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf113633-1b94-4991-89f2-7418c97f620f-util\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.430701 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p67bd\" (UniqueName: \"kubernetes.io/projected/bf113633-1b94-4991-89f2-7418c97f620f-kube-api-access-p67bd\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.875071 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" event={"ID":"bf113633-1b94-4991-89f2-7418c97f620f","Type":"ContainerDied","Data":"7d702ada694071870fe54546a49027e2624deabff39377b6cfbd70e6111a79b2"} Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.875131 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d702ada694071870fe54546a49027e2624deabff39377b6cfbd70e6111a79b2" Oct 02 16:30:35 crc kubenswrapper[4882]: I1002 16:30:35.875176 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.925006 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-llkzt"] Oct 02 16:30:38 crc kubenswrapper[4882]: E1002 16:30:38.926455 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="pull" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.926524 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="pull" Oct 02 16:30:38 crc kubenswrapper[4882]: E1002 16:30:38.926597 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="util" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.926652 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="util" Oct 02 16:30:38 crc kubenswrapper[4882]: E1002 16:30:38.926706 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="extract" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.926753 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="extract" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.926886 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf113633-1b94-4991-89f2-7418c97f620f" containerName="extract" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.927328 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.929376 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.929671 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.930540 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-blll8" Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.938132 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-llkzt"] Oct 02 16:30:38 crc kubenswrapper[4882]: I1002 16:30:38.974973 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgk2t\" (UniqueName: \"kubernetes.io/projected/c0a1d391-cb92-485c-b8a1-55f6383284af-kube-api-access-kgk2t\") pod \"nmstate-operator-858ddd8f98-llkzt\" (UID: \"c0a1d391-cb92-485c-b8a1-55f6383284af\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.075979 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgk2t\" (UniqueName: \"kubernetes.io/projected/c0a1d391-cb92-485c-b8a1-55f6383284af-kube-api-access-kgk2t\") pod \"nmstate-operator-858ddd8f98-llkzt\" (UID: \"c0a1d391-cb92-485c-b8a1-55f6383284af\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.102320 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgk2t\" (UniqueName: \"kubernetes.io/projected/c0a1d391-cb92-485c-b8a1-55f6383284af-kube-api-access-kgk2t\") pod \"nmstate-operator-858ddd8f98-llkzt\" (UID: \"c0a1d391-cb92-485c-b8a1-55f6383284af\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.244619 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.390970 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.391514 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.673972 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-llkzt"] Oct 02 16:30:39 crc kubenswrapper[4882]: I1002 16:30:39.896690 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" event={"ID":"c0a1d391-cb92-485c-b8a1-55f6383284af","Type":"ContainerStarted","Data":"76b45bbd1cee621ea63b48e35123065f955e458a5591e9a956ffc9bf8b52003a"} Oct 02 16:30:40 crc kubenswrapper[4882]: I1002 16:30:40.073155 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:40 crc kubenswrapper[4882]: I1002 16:30:40.073235 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:40 crc kubenswrapper[4882]: I1002 16:30:40.124714 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:40 crc kubenswrapper[4882]: I1002 16:30:40.945003 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:42 crc kubenswrapper[4882]: I1002 16:30:42.484451 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:42 crc kubenswrapper[4882]: I1002 16:30:42.914998 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" event={"ID":"c0a1d391-cb92-485c-b8a1-55f6383284af","Type":"ContainerStarted","Data":"3d01b5b8bafab44659683d415ad0bc08974caa1632b937e51deda0d6a870c12d"} Oct 02 16:30:42 crc kubenswrapper[4882]: I1002 16:30:42.915105 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4qkg4" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="registry-server" containerID="cri-o://29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57" gracePeriod=2 Oct 02 16:30:42 crc kubenswrapper[4882]: I1002 16:30:42.938682 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-llkzt" podStartSLOduration=2.115117281 podStartE2EDuration="4.938659187s" podCreationTimestamp="2025-10-02 16:30:38 +0000 UTC" firstStartedPulling="2025-10-02 16:30:39.683641446 +0000 UTC m=+798.432870973" lastFinishedPulling="2025-10-02 16:30:42.507183352 +0000 UTC m=+801.256412879" observedRunningTime="2025-10-02 16:30:42.93533784 +0000 UTC m=+801.684567367" watchObservedRunningTime="2025-10-02 
16:30:42.938659187 +0000 UTC m=+801.687888724" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.304273 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.330685 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities\") pod \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.330750 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbf82\" (UniqueName: \"kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82\") pod \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.330783 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content\") pod \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\" (UID: \"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0\") " Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.332110 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities" (OuterVolumeSpecName: "utilities") pod "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" (UID: "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.340495 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82" (OuterVolumeSpecName: "kube-api-access-kbf82") pod "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" (UID: "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0"). InnerVolumeSpecName "kube-api-access-kbf82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.406638 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" (UID: "e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.433569 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.433639 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbf82\" (UniqueName: \"kubernetes.io/projected/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-kube-api-access-kbf82\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.433656 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.922512 4882 generic.go:334] "Generic (PLEG): container finished" podID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerID="29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57" exitCode=0 Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.922585 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qkg4" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.922616 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerDied","Data":"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57"} Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.922655 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qkg4" event={"ID":"e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0","Type":"ContainerDied","Data":"1a90dbebbf8b4218a7a22c7ad99d54104060a552c51cc2105bc3571295ce2113"} Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.922677 4882 scope.go:117] "RemoveContainer" containerID="29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.942067 4882 scope.go:117] "RemoveContainer" containerID="b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.959293 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.961809 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qkg4"] Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.974880 4882 scope.go:117] "RemoveContainer" containerID="bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.989443 4882 scope.go:117] "RemoveContainer" containerID="29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57" Oct 02 16:30:43 crc kubenswrapper[4882]: E1002 16:30:43.989917 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57\": container with ID starting with 29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57 not found: ID does not exist" containerID="29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.989983 4882 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57"} err="failed to get container status \"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57\": rpc error: code = NotFound desc = could not find container \"29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57\": container with ID starting with 29298466febdbeca9fc518590acdfa0183b53d743fe152538799c83dd323aa57 not found: ID does not exist" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.990032 4882 scope.go:117] "RemoveContainer" containerID="b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa" Oct 02 16:30:43 crc kubenswrapper[4882]: E1002 16:30:43.990510 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa\": container with ID starting with b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa not found: ID does not exist" containerID="b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.990557 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa"} err="failed to get container status \"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa\": rpc error: code = NotFound desc = could not find container \"b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa\": container with ID starting with b386d175a97497d89ce59e04c9f5f66e4cf58a2a71e5a31d7af0f50fbd63f0aa not found: ID does not exist" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.990592 4882 scope.go:117] "RemoveContainer" containerID="bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6" Oct 02 16:30:43 crc kubenswrapper[4882]: E1002 16:30:43.990839 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6\": container with ID starting with bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6 not found: ID does not exist" containerID="bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6" Oct 02 16:30:43 crc kubenswrapper[4882]: I1002 16:30:43.990867 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6"} err="failed to get container status \"bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6\": rpc error: code = NotFound desc = could not find container \"bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6\": container with ID starting with bd3331c22551ff8b87506effc4ddd284fad4db1a630f5974f2202dd5df7749e6 not found: ID does not exist" Oct 02 16:30:44 crc kubenswrapper[4882]: I1002 16:30:44.772084 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" path="/var/lib/kubelet/pods/e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0/volumes" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.909541 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j"] Oct 02 16:30:47 crc kubenswrapper[4882]: E1002 16:30:47.910372 4882 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="registry-server" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.910390 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="registry-server" Oct 02 16:30:47 crc kubenswrapper[4882]: E1002 16:30:47.910402 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="extract-content" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.910408 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="extract-content" Oct 02 16:30:47 crc kubenswrapper[4882]: E1002 16:30:47.910430 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="extract-utilities" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.910438 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="extract-utilities" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.910574 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d3fbc5-23ba-44e7-aba3-089b68e6d2c0" containerName="registry-server" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.911372 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" Oct 02 16:30:47 crc kubenswrapper[4882]: W1002 16:30:47.914493 4882 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-8c5x2": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-8c5x2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 02 16:30:47 crc kubenswrapper[4882]: E1002 16:30:47.914569 4882 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-8c5x2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-8c5x2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.920977 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp"] Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.921896 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.925199 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.927298 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j"] Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.963629 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9gswz"] Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.964399 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.973549 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp"] Oct 02 16:30:47 crc kubenswrapper[4882]: I1002 16:30:47.995017 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph46p\" (UniqueName: \"kubernetes.io/projected/fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e-kube-api-access-ph46p\") pod \"nmstate-metrics-fdff9cb8d-mf42j\" (UID: \"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.087657 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.088800 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.090949 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.091607 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4sn4f" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.091781 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096107 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-nmstate-lock\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096165 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-ovs-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096197 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/aede390b-5a93-41e4-b404-29edaf0d16c9-kube-api-access-rcdfb\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096340 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aede390b-5a93-41e4-b404-29edaf0d16c9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096401 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmzj\" (UniqueName: \"kubernetes.io/projected/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-kube-api-access-dnmzj\") pod \"nmstate-handler-9gswz\" (UID: 
\"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096578 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-dbus-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.096651 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph46p\" (UniqueName: \"kubernetes.io/projected/fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e-kube-api-access-ph46p\") pod \"nmstate-metrics-fdff9cb8d-mf42j\" (UID: \"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.097829 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.118119 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph46p\" (UniqueName: \"kubernetes.io/projected/fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e-kube-api-access-ph46p\") pod \"nmstate-metrics-fdff9cb8d-mf42j\" (UID: \"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198523 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-dbus-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198604 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-nmstate-lock\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198642 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a304df3f-c207-4b0e-93e4-62f68ae75159-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198672 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-ovs-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198701 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/aede390b-5a93-41e4-b404-29edaf0d16c9-kube-api-access-rcdfb\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198745 4882 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nghn\" (UniqueName: \"kubernetes.io/projected/a304df3f-c207-4b0e-93e4-62f68ae75159-kube-api-access-5nghn\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198777 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aede390b-5a93-41e4-b404-29edaf0d16c9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198801 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a304df3f-c207-4b0e-93e4-62f68ae75159-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198829 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmzj\" (UniqueName: \"kubernetes.io/projected/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-kube-api-access-dnmzj\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198870 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-nmstate-lock\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198950 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-dbus-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.198902 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-ovs-socket\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.211397 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aede390b-5a93-41e4-b404-29edaf0d16c9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.221470 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/aede390b-5a93-41e4-b404-29edaf0d16c9-kube-api-access-rcdfb\") pod \"nmstate-webhook-6cdbc54649-skbdp\" (UID: \"aede390b-5a93-41e4-b404-29edaf0d16c9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:48 crc 
kubenswrapper[4882]: I1002 16:30:48.227704 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmzj\" (UniqueName: \"kubernetes.io/projected/0239ce1d-4e1a-4ffa-b9b3-654250a881b8-kube-api-access-dnmzj\") pod \"nmstate-handler-9gswz\" (UID: \"0239ce1d-4e1a-4ffa-b9b3-654250a881b8\") " pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.300053 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nghn\" (UniqueName: \"kubernetes.io/projected/a304df3f-c207-4b0e-93e4-62f68ae75159-kube-api-access-5nghn\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.300132 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a304df3f-c207-4b0e-93e4-62f68ae75159-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.300234 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a304df3f-c207-4b0e-93e4-62f68ae75159-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.301480 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a304df3f-c207-4b0e-93e4-62f68ae75159-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.306580 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ff46c6c9c-l4lbs"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.307387 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.316124 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a304df3f-c207-4b0e-93e4-62f68ae75159-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.333314 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ff46c6c9c-l4lbs"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.334168 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nghn\" (UniqueName: \"kubernetes.io/projected/a304df3f-c207-4b0e-93e4-62f68ae75159-kube-api-access-5nghn\") pod \"nmstate-console-plugin-6b874cbd85-p26rf\" (UID: \"a304df3f-c207-4b0e-93e4-62f68ae75159\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.401988 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402050 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sbl9\" (UniqueName: \"kubernetes.io/projected/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-kube-api-access-7sbl9\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402080 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402103 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-oauth-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402118 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-trusted-ca-bundle\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402178 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-oauth-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " 
pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.402236 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-service-ca\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.409967 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503136 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503200 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sbl9\" (UniqueName: \"kubernetes.io/projected/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-kube-api-access-7sbl9\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503319 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503353 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-oauth-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503522 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-trusted-ca-bundle\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503548 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-oauth-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.503596 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-service-ca\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.504541 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.504919 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-service-ca\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.504928 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-oauth-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.505073 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-trusted-ca-bundle\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.510148 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-serving-cert\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.510762 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-console-oauth-config\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.525730 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sbl9\" (UniqueName: \"kubernetes.io/projected/2a05fc3f-e9ae-43ec-9a4e-bab1356d138b-kube-api-access-7sbl9\") pod \"console-6ff46c6c9c-l4lbs\" (UID: \"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b\") " pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.667849 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.702583 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.703903 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.705902 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.809245 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.809322 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.809492 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlsm\" (UniqueName: \"kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.852033 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.904076 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ff46c6c9c-l4lbs"] Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.911076 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.911154 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.911277 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlsm\" (UniqueName: \"kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.912497 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.912540 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: W1002 16:30:48.917928 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a05fc3f_e9ae_43ec_9a4e_bab1356d138b.slice/crio-542460b934c99efab67b5a45e7d5f00584e1c9cb1f1b850ff2254151f1fcde15 WatchSource:0}: Error finding container 542460b934c99efab67b5a45e7d5f00584e1c9cb1f1b850ff2254151f1fcde15: Status 404 returned error can't find the container with id 542460b934c99efab67b5a45e7d5f00584e1c9cb1f1b850ff2254151f1fcde15 Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.935751 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlsm\" (UniqueName: \"kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm\") pod \"redhat-marketplace-pl7tg\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.973858 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" event={"ID":"a304df3f-c207-4b0e-93e4-62f68ae75159","Type":"ContainerStarted","Data":"ff6dec9fc6dd03e9e0deba3e9aa991b7524a7830d6314b884c1d14cfb2cd3360"} Oct 02 16:30:48 crc kubenswrapper[4882]: I1002 16:30:48.976193 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff46c6c9c-l4lbs" event={"ID":"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b","Type":"ContainerStarted","Data":"542460b934c99efab67b5a45e7d5f00584e1c9cb1f1b850ff2254151f1fcde15"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.028924 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.141422 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8c5x2" Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.149066 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.151439 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.151582 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.572079 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:30:49 crc kubenswrapper[4882]: W1002 16:30:49.577686 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8822b1a5_437e_49aa_a712_1882510ea1d1.slice/crio-bc4a40c1ce77ec7a947eb2e32a5c7edea92f2adae89441c305995a505e8c333f WatchSource:0}: Error finding container bc4a40c1ce77ec7a947eb2e32a5c7edea92f2adae89441c305995a505e8c333f: Status 404 returned error can't find the container with id bc4a40c1ce77ec7a947eb2e32a5c7edea92f2adae89441c305995a505e8c333f Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.639388 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp"] Oct 02 16:30:49 crc kubenswrapper[4882]: W1002 16:30:49.646518 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaede390b_5a93_41e4_b404_29edaf0d16c9.slice/crio-f5a3293b12f81d6e299b7ca04757dee77f7315d819eb6ef6af9e456781d07441 WatchSource:0}: Error finding container f5a3293b12f81d6e299b7ca04757dee77f7315d819eb6ef6af9e456781d07441: Status 404 returned error can't find the container with id f5a3293b12f81d6e299b7ca04757dee77f7315d819eb6ef6af9e456781d07441 Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.683713 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j"] Oct 02 16:30:49 crc kubenswrapper[4882]: W1002 16:30:49.696116 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe54acc0_8f5b_4a7a_8fd9_c8dec2c4ca3e.slice/crio-28f59dc1d85dbc6ac90c1b74c0f749bb24f8598e70bcab5b39196110ac5debc7 WatchSource:0}: Error finding container 28f59dc1d85dbc6ac90c1b74c0f749bb24f8598e70bcab5b39196110ac5debc7: Status 404 returned error can't find the container with id 28f59dc1d85dbc6ac90c1b74c0f749bb24f8598e70bcab5b39196110ac5debc7 Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.981947 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff46c6c9c-l4lbs" event={"ID":"2a05fc3f-e9ae-43ec-9a4e-bab1356d138b","Type":"ContainerStarted","Data":"f46ddbc81588dbb0bef11467c0b31d124e66acbc7eaabf257539943321e41162"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.985979 4882 generic.go:334] "Generic (PLEG): container finished" podID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerID="fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc" exitCode=0 Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.986074 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerDied","Data":"fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.986101 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerStarted","Data":"bc4a40c1ce77ec7a947eb2e32a5c7edea92f2adae89441c305995a505e8c333f"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.987102 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-9gswz" event={"ID":"0239ce1d-4e1a-4ffa-b9b3-654250a881b8","Type":"ContainerStarted","Data":"3d42fa57a55d34af4995d50558f2da25411ad396342a52a04671a8685ddf7602"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.988149 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" event={"ID":"aede390b-5a93-41e4-b404-29edaf0d16c9","Type":"ContainerStarted","Data":"f5a3293b12f81d6e299b7ca04757dee77f7315d819eb6ef6af9e456781d07441"} Oct 02 16:30:49 crc kubenswrapper[4882]: I1002 16:30:49.989517 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" event={"ID":"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e","Type":"ContainerStarted","Data":"28f59dc1d85dbc6ac90c1b74c0f749bb24f8598e70bcab5b39196110ac5debc7"} Oct 02 16:30:50 crc kubenswrapper[4882]: I1002 16:30:50.008459 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ff46c6c9c-l4lbs" podStartSLOduration=2.008435263 podStartE2EDuration="2.008435263s" podCreationTimestamp="2025-10-02 16:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:30:50.006278133 +0000 UTC m=+808.755507660" watchObservedRunningTime="2025-10-02 16:30:50.008435263 +0000 UTC m=+808.757664790" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.036246 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" event={"ID":"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e","Type":"ContainerStarted","Data":"58e222ac3c7cac179fce6bee50a37eda5469d39766578daecbe4920e33f61bed"} Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.038757 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" event={"ID":"a304df3f-c207-4b0e-93e4-62f68ae75159","Type":"ContainerStarted","Data":"e72f9da973d63ae65b9a29fa40672bad52232f0fb05996e9cf13c9baee3fb060"} Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.041902 4882 generic.go:334] "Generic (PLEG): container finished" podID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerID="b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e" exitCode=0 Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.041962 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerDied","Data":"b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e"} Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.045079 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9gswz" event={"ID":"0239ce1d-4e1a-4ffa-b9b3-654250a881b8","Type":"ContainerStarted","Data":"9518a29b5a762fc4c2baed7ce63c20c604f2af9661f003382f232c0334946a78"} Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.045170 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.046987 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" event={"ID":"aede390b-5a93-41e4-b404-29edaf0d16c9","Type":"ContainerStarted","Data":"d9a63462d7a9d333bbd9555710a078d561861dfffc5120620a245af94faa2fdc"} Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.047152 4882 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.058087 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-p26rf" podStartSLOduration=1.9836137809999999 podStartE2EDuration="8.058046962s" podCreationTimestamp="2025-10-02 16:30:48 +0000 UTC" firstStartedPulling="2025-10-02 16:30:48.887763297 +0000 UTC m=+807.636992824" lastFinishedPulling="2025-10-02 16:30:54.962196478 +0000 UTC m=+813.711426005" observedRunningTime="2025-10-02 16:30:56.054721806 +0000 UTC m=+814.803951333" watchObservedRunningTime="2025-10-02 16:30:56.058046962 +0000 UTC m=+814.807276489" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.103015 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" podStartSLOduration=3.688662197 podStartE2EDuration="9.102992399s" podCreationTimestamp="2025-10-02 16:30:47 +0000 UTC" firstStartedPulling="2025-10-02 16:30:49.650423453 +0000 UTC m=+808.399652980" lastFinishedPulling="2025-10-02 16:30:55.064753645 +0000 UTC m=+813.813983182" observedRunningTime="2025-10-02 16:30:56.099308373 +0000 UTC m=+814.848537900" watchObservedRunningTime="2025-10-02 16:30:56.102992399 +0000 UTC m=+814.852221926" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.117868 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9gswz" podStartSLOduration=3.226283589 podStartE2EDuration="9.117847582s" podCreationTimestamp="2025-10-02 16:30:47 +0000 UTC" firstStartedPulling="2025-10-02 16:30:49.201372862 +0000 UTC m=+807.950602389" lastFinishedPulling="2025-10-02 16:30:55.092936855 +0000 UTC m=+813.842166382" observedRunningTime="2025-10-02 16:30:56.114842063 +0000 UTC m=+814.864071610" watchObservedRunningTime="2025-10-02 16:30:56.117847582 +0000 UTC m=+814.867077109" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.690574 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.693111 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.704036 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.833911 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.833963 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.834020 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n27j\" (UniqueName: \"kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.935473 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.935544 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n27j\" (UniqueName: \"kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.935650 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.936082 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.936279 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:56 crc kubenswrapper[4882]: I1002 16:30:56.963028 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2n27j\" (UniqueName: \"kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j\") pod \"certified-operators-74w2b\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:57 crc kubenswrapper[4882]: I1002 16:30:57.017529 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:30:57 crc kubenswrapper[4882]: I1002 16:30:57.509407 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:30:57 crc kubenswrapper[4882]: W1002 16:30:57.549407 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda998e834_6096_4e41_862c_41e0950f7cb3.slice/crio-5b4806883ff854cf559c44e623b2f6031a266718bd0f2cd429730291ea2a593d WatchSource:0}: Error finding container 5b4806883ff854cf559c44e623b2f6031a266718bd0f2cd429730291ea2a593d: Status 404 returned error can't find the container with id 5b4806883ff854cf559c44e623b2f6031a266718bd0f2cd429730291ea2a593d Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.065159 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerStarted","Data":"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e"} Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.071549 4882 generic.go:334] "Generic (PLEG): container finished" podID="a998e834-6096-4e41-862c-41e0950f7cb3" containerID="2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005" exitCode=0 Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.073082 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerDied","Data":"2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005"} Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.073118 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerStarted","Data":"5b4806883ff854cf559c44e623b2f6031a266718bd0f2cd429730291ea2a593d"} Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.099183 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pl7tg" podStartSLOduration=3.046642087 podStartE2EDuration="10.099158055s" podCreationTimestamp="2025-10-02 16:30:48 +0000 UTC" firstStartedPulling="2025-10-02 16:30:49.987747425 +0000 UTC m=+808.736976952" lastFinishedPulling="2025-10-02 16:30:57.040263393 +0000 UTC m=+815.789492920" observedRunningTime="2025-10-02 16:30:58.097679291 +0000 UTC m=+816.846908818" watchObservedRunningTime="2025-10-02 16:30:58.099158055 +0000 UTC m=+816.848387582" Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.668565 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.668915 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:58 crc kubenswrapper[4882]: I1002 16:30:58.674561 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.029575 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.029640 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.081734 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" event={"ID":"fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e","Type":"ContainerStarted","Data":"6be5d63939c200cc6f80954c2f70d751a0d18984ff5e4e72a35de8ed6254b6d9"} Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.083186 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.086119 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ff46c6c9c-l4lbs" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.102786 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-mf42j" podStartSLOduration=3.273397457 podStartE2EDuration="12.102761361s" podCreationTimestamp="2025-10-02 16:30:47 +0000 UTC" firstStartedPulling="2025-10-02 16:30:49.6984059 +0000 UTC m=+808.447635427" lastFinishedPulling="2025-10-02 16:30:58.527769814 +0000 UTC m=+817.276999331" observedRunningTime="2025-10-02 16:30:59.098090933 +0000 UTC m=+817.847320450" watchObservedRunningTime="2025-10-02 16:30:59.102761361 +0000 UTC m=+817.851990878" Oct 02 16:30:59 crc kubenswrapper[4882]: I1002 16:30:59.176089 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"] Oct 02 16:31:01 crc kubenswrapper[4882]: I1002 16:31:01.108858 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerStarted","Data":"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86"} Oct 02 16:31:02 crc kubenswrapper[4882]: I1002 16:31:02.118172 4882 generic.go:334] "Generic (PLEG): container finished" podID="a998e834-6096-4e41-862c-41e0950f7cb3" containerID="b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86" exitCode=0 Oct 02 16:31:02 crc kubenswrapper[4882]: I1002 16:31:02.118270 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerDied","Data":"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86"} Oct 02 16:31:03 crc kubenswrapper[4882]: I1002 16:31:03.124686 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerStarted","Data":"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48"} Oct 02 16:31:03 crc kubenswrapper[4882]: I1002 16:31:03.147522 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74w2b" podStartSLOduration=2.7647344609999998 podStartE2EDuration="7.147506182s" podCreationTimestamp="2025-10-02 16:30:56 +0000 UTC" firstStartedPulling="2025-10-02 16:30:58.368286024 +0000 UTC 
m=+817.117515551" lastFinishedPulling="2025-10-02 16:31:02.751057755 +0000 UTC m=+821.500287272" observedRunningTime="2025-10-02 16:31:03.147131904 +0000 UTC m=+821.896361441" watchObservedRunningTime="2025-10-02 16:31:03.147506182 +0000 UTC m=+821.896735709" Oct 02 16:31:04 crc kubenswrapper[4882]: I1002 16:31:04.187746 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9gswz" Oct 02 16:31:07 crc kubenswrapper[4882]: I1002 16:31:07.018242 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:07 crc kubenswrapper[4882]: I1002 16:31:07.018633 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:07 crc kubenswrapper[4882]: I1002 16:31:07.061941 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:07 crc kubenswrapper[4882]: I1002 16:31:07.197905 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:07 crc kubenswrapper[4882]: I1002 16:31:07.887567 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.068041 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.154395 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-skbdp" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.158171 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74w2b" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="registry-server" containerID="cri-o://6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48" gracePeriod=2 Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.390399 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.390477 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.390525 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.391138 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:31:09 crc 
kubenswrapper[4882]: I1002 16:31:09.391198 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480" gracePeriod=600 Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.536630 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.624735 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content\") pod \"a998e834-6096-4e41-862c-41e0950f7cb3\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.625038 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n27j\" (UniqueName: \"kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j\") pod \"a998e834-6096-4e41-862c-41e0950f7cb3\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.625135 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities\") pod \"a998e834-6096-4e41-862c-41e0950f7cb3\" (UID: \"a998e834-6096-4e41-862c-41e0950f7cb3\") " Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.626152 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities" (OuterVolumeSpecName: "utilities") pod "a998e834-6096-4e41-862c-41e0950f7cb3" (UID: "a998e834-6096-4e41-862c-41e0950f7cb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.631396 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j" (OuterVolumeSpecName: "kube-api-access-2n27j") pod "a998e834-6096-4e41-862c-41e0950f7cb3" (UID: "a998e834-6096-4e41-862c-41e0950f7cb3"). InnerVolumeSpecName "kube-api-access-2n27j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.676599 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a998e834-6096-4e41-862c-41e0950f7cb3" (UID: "a998e834-6096-4e41-862c-41e0950f7cb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.726552 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.726592 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n27j\" (UniqueName: \"kubernetes.io/projected/a998e834-6096-4e41-862c-41e0950f7cb3-kube-api-access-2n27j\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:09 crc kubenswrapper[4882]: I1002 16:31:09.726609 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a998e834-6096-4e41-862c-41e0950f7cb3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.169426 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480"} Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.169316 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480" exitCode=0 Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.169779 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48"} Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.169922 4882 scope.go:117] "RemoveContainer" containerID="29f4555583b1d199e6bae425394a26d6363317a57d30e13ad2c6217759f86807" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.173292 4882 generic.go:334] "Generic (PLEG): container finished" podID="a998e834-6096-4e41-862c-41e0950f7cb3" containerID="6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48" exitCode=0 Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.173346 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerDied","Data":"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48"} Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.173365 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74w2b" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.173383 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74w2b" event={"ID":"a998e834-6096-4e41-862c-41e0950f7cb3","Type":"ContainerDied","Data":"5b4806883ff854cf559c44e623b2f6031a266718bd0f2cd429730291ea2a593d"} Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.197061 4882 scope.go:117] "RemoveContainer" containerID="6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.206370 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.213206 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74w2b"] Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.223948 4882 scope.go:117] "RemoveContainer" containerID="b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.242005 4882 scope.go:117] "RemoveContainer" containerID="2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.255713 4882 scope.go:117] "RemoveContainer" containerID="6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48" Oct 02 16:31:10 crc kubenswrapper[4882]: E1002 16:31:10.256117 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48\": container with ID starting with 6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48 not found: ID does not exist" containerID="6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.256271 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48"} err="failed to get container status \"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48\": rpc error: code = NotFound desc = could not find container \"6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48\": container with ID starting with 6983c5a6fe7d9133ebb5169a9250490953357e37653dd152b40efec33e031c48 not found: ID does not exist" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.256379 4882 scope.go:117] "RemoveContainer" containerID="b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86" Oct 02 16:31:10 crc kubenswrapper[4882]: E1002 16:31:10.256808 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86\": container with ID starting with b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86 not found: ID does not exist" containerID="b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.256854 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86"} err="failed to get container status \"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86\": rpc error: code = NotFound desc = could not find 
container \"b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86\": container with ID starting with b55a68a5bc5193c15543b050e434a80575063b7ff8849bc284adcc1ed5b9fc86 not found: ID does not exist" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.256882 4882 scope.go:117] "RemoveContainer" containerID="2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005" Oct 02 16:31:10 crc kubenswrapper[4882]: E1002 16:31:10.257434 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005\": container with ID starting with 2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005 not found: ID does not exist" containerID="2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.257712 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005"} err="failed to get container status \"2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005\": rpc error: code = NotFound desc = could not find container \"2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005\": container with ID starting with 2caf5282d4ded4f8208a2eb2e10bc730010292ce088b38b0ab6f8dd4bd072005 not found: ID does not exist" Oct 02 16:31:10 crc kubenswrapper[4882]: I1002 16:31:10.767145 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" path="/var/lib/kubelet/pods/a998e834-6096-4e41-862c-41e0950f7cb3/volumes" Oct 02 16:31:12 crc kubenswrapper[4882]: I1002 16:31:12.685740 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:31:12 crc kubenswrapper[4882]: I1002 16:31:12.686323 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pl7tg" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="registry-server" containerID="cri-o://18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e" gracePeriod=2 Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.063720 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.173813 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities\") pod \"8822b1a5-437e-49aa-a712-1882510ea1d1\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.173899 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdlsm\" (UniqueName: \"kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm\") pod \"8822b1a5-437e-49aa-a712-1882510ea1d1\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.173952 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content\") pod \"8822b1a5-437e-49aa-a712-1882510ea1d1\" (UID: \"8822b1a5-437e-49aa-a712-1882510ea1d1\") " Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.175231 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities" (OuterVolumeSpecName: "utilities") pod "8822b1a5-437e-49aa-a712-1882510ea1d1" (UID: "8822b1a5-437e-49aa-a712-1882510ea1d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.179684 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm" (OuterVolumeSpecName: "kube-api-access-qdlsm") pod "8822b1a5-437e-49aa-a712-1882510ea1d1" (UID: "8822b1a5-437e-49aa-a712-1882510ea1d1"). InnerVolumeSpecName "kube-api-access-qdlsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.191551 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8822b1a5-437e-49aa-a712-1882510ea1d1" (UID: "8822b1a5-437e-49aa-a712-1882510ea1d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.196256 4882 generic.go:334] "Generic (PLEG): container finished" podID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerID="18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e" exitCode=0 Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.196299 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerDied","Data":"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e"} Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.196330 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl7tg" event={"ID":"8822b1a5-437e-49aa-a712-1882510ea1d1","Type":"ContainerDied","Data":"bc4a40c1ce77ec7a947eb2e32a5c7edea92f2adae89441c305995a505e8c333f"} Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.196349 4882 scope.go:117] "RemoveContainer" containerID="18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.196491 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl7tg" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.215970 4882 scope.go:117] "RemoveContainer" containerID="b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.229639 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.234207 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl7tg"] Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.256309 4882 scope.go:117] "RemoveContainer" containerID="fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.275844 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.275908 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdlsm\" (UniqueName: \"kubernetes.io/projected/8822b1a5-437e-49aa-a712-1882510ea1d1-kube-api-access-qdlsm\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.275920 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8822b1a5-437e-49aa-a712-1882510ea1d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.280301 4882 scope.go:117] "RemoveContainer" containerID="18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e" Oct 02 16:31:13 crc kubenswrapper[4882]: E1002 16:31:13.280928 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e\": container with ID starting with 18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e not found: ID does not exist" containerID="18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.280993 4882 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e"} err="failed to get container status \"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e\": rpc error: code = NotFound desc = could not find container \"18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e\": container with ID starting with 18f1774382bf34120be0297ab9d79f10fa6a25ce215702ff88d72c653f74120e not found: ID does not exist" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.281031 4882 scope.go:117] "RemoveContainer" containerID="b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e" Oct 02 16:31:13 crc kubenswrapper[4882]: E1002 16:31:13.281995 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e\": container with ID starting with b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e not found: ID does not exist" containerID="b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.282044 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e"} err="failed to get container status \"b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e\": rpc error: code = NotFound desc = could not find container \"b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e\": container with ID starting with b7b94eb19448d282bf04117f52f7134a7b832236a548139862fdff8d50add29e not found: ID does not exist" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.282084 4882 scope.go:117] "RemoveContainer" containerID="fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc" Oct 02 16:31:13 crc kubenswrapper[4882]: E1002 16:31:13.282445 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc\": container with ID starting with fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc not found: ID does not exist" containerID="fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc" Oct 02 16:31:13 crc kubenswrapper[4882]: I1002 16:31:13.282485 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc"} err="failed to get container status \"fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc\": rpc error: code = NotFound desc = could not find container \"fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc\": container with ID starting with fb74ef1cab38ba021983391e798984463d6aafd80876439bfaa13f017f5a1dfc not found: ID does not exist" Oct 02 16:31:14 crc kubenswrapper[4882]: I1002 16:31:14.770094 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" path="/var/lib/kubelet/pods/8822b1a5-437e-49aa-a712-1882510ea1d1/volumes" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.343250 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82"] Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344579 4882 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="extract-content" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344599 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="extract-content" Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344608 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344615 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344628 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="extract-utilities" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344657 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="extract-utilities" Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344667 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="extract-utilities" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344672 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="extract-utilities" Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344685 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344691 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: E1002 16:31:23.344699 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="extract-content" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344704 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="extract-content" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344817 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a998e834-6096-4e41-862c-41e0950f7cb3" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.344833 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="8822b1a5-437e-49aa-a712-1882510ea1d1" containerName="registry-server" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.345798 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.348132 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.353548 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82"] Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.429112 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6t5p\" (UniqueName: \"kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.429183 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.429271 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.530025 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.530117 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.530208 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6t5p\" (UniqueName: \"kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.530894 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.530969 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.551317 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6t5p\" (UniqueName: \"kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:23 crc kubenswrapper[4882]: I1002 16:31:23.673577 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.085190 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82"] Oct 02 16:31:24 crc kubenswrapper[4882]: W1002 16:31:24.097810 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2968d005_f404_4387_ac4c_739ba42a465e.slice/crio-50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017 WatchSource:0}: Error finding container 50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017: Status 404 returned error can't find the container with id 50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017 Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.214926 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9f6pj" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" containerID="cri-o://4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0" gracePeriod=15 Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.297789 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerStarted","Data":"3964e08d3d84b2cda5318ef71d09734908dc838cc11410cbed4e6601b25b87d8"} Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.297841 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerStarted","Data":"50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017"} Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.605898 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9f6pj_c5af616c-8948-402c-97b8-3aadd17673d2/console/0.log" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.606456 
4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747326 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747380 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzm9l\" (UniqueName: \"kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747416 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747444 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747548 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.747972 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.748895 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.748913 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca" (OuterVolumeSpecName: "service-ca") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.748966 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config\") pod \"c5af616c-8948-402c-97b8-3aadd17673d2\" (UID: \"c5af616c-8948-402c-97b8-3aadd17673d2\") " Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.749165 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config" (OuterVolumeSpecName: "console-config") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.749650 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.749722 4882 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.749740 4882 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.749750 4882 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.757126 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l" (OuterVolumeSpecName: "kube-api-access-lzm9l") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "kube-api-access-lzm9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.760933 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.761271 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c5af616c-8948-402c-97b8-3aadd17673d2" (UID: "c5af616c-8948-402c-97b8-3aadd17673d2"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.850589 4882 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5af616c-8948-402c-97b8-3aadd17673d2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.850650 4882 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.850665 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzm9l\" (UniqueName: \"kubernetes.io/projected/c5af616c-8948-402c-97b8-3aadd17673d2-kube-api-access-lzm9l\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:24 crc kubenswrapper[4882]: I1002 16:31:24.850682 4882 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5af616c-8948-402c-97b8-3aadd17673d2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.306106 4882 generic.go:334] "Generic (PLEG): container finished" podID="2968d005-f404-4387-ac4c-739ba42a465e" containerID="3964e08d3d84b2cda5318ef71d09734908dc838cc11410cbed4e6601b25b87d8" exitCode=0 Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.306489 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerDied","Data":"3964e08d3d84b2cda5318ef71d09734908dc838cc11410cbed4e6601b25b87d8"} Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.308824 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9f6pj_c5af616c-8948-402c-97b8-3aadd17673d2/console/0.log" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.308856 4882 generic.go:334] "Generic (PLEG): container finished" podID="c5af616c-8948-402c-97b8-3aadd17673d2" containerID="4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0" exitCode=2 Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.308905 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9f6pj" event={"ID":"c5af616c-8948-402c-97b8-3aadd17673d2","Type":"ContainerDied","Data":"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0"} Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.308952 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9f6pj" event={"ID":"c5af616c-8948-402c-97b8-3aadd17673d2","Type":"ContainerDied","Data":"e5a6c601d68cc183fb950d1e9f8e066d7b58892ebdcdebe15e027613e0c3a051"} Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.308969 4882 scope.go:117] "RemoveContainer" containerID="4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.309060 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9f6pj" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.338000 4882 scope.go:117] "RemoveContainer" containerID="4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0" Oct 02 16:31:25 crc kubenswrapper[4882]: E1002 16:31:25.339224 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0\": container with ID starting with 4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0 not found: ID does not exist" containerID="4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.339264 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0"} err="failed to get container status \"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0\": rpc error: code = NotFound desc = could not find container \"4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0\": container with ID starting with 4a99c6a9bbfe83f512bad47ee343daacb862798fb34f76cfa20444037aee49e0 not found: ID does not exist" Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.348541 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"] Oct 02 16:31:25 crc kubenswrapper[4882]: I1002 16:31:25.351556 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9f6pj"] Oct 02 16:31:26 crc kubenswrapper[4882]: I1002 16:31:26.770295 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" path="/var/lib/kubelet/pods/c5af616c-8948-402c-97b8-3aadd17673d2/volumes" Oct 02 16:31:28 crc kubenswrapper[4882]: I1002 16:31:28.336199 4882 generic.go:334] "Generic (PLEG): container finished" podID="2968d005-f404-4387-ac4c-739ba42a465e" containerID="4c8a1038147b18ecb5666d93d6ffcb944e5fe2c4508573c8702273d1df5d3032" exitCode=0 Oct 02 16:31:28 crc kubenswrapper[4882]: I1002 16:31:28.336336 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerDied","Data":"4c8a1038147b18ecb5666d93d6ffcb944e5fe2c4508573c8702273d1df5d3032"} Oct 02 16:31:29 crc kubenswrapper[4882]: I1002 16:31:29.347575 4882 generic.go:334] "Generic (PLEG): container finished" podID="2968d005-f404-4387-ac4c-739ba42a465e" containerID="31286aff4ee95baadd62bc8ee6a2966cdcfe98f46352462bafbdc9252429a3b5" exitCode=0 Oct 02 16:31:29 crc kubenswrapper[4882]: I1002 16:31:29.347900 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerDied","Data":"31286aff4ee95baadd62bc8ee6a2966cdcfe98f46352462bafbdc9252429a3b5"} Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.633614 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.727404 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util\") pod \"2968d005-f404-4387-ac4c-739ba42a465e\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.727499 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6t5p\" (UniqueName: \"kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p\") pod \"2968d005-f404-4387-ac4c-739ba42a465e\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.727584 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle\") pod \"2968d005-f404-4387-ac4c-739ba42a465e\" (UID: \"2968d005-f404-4387-ac4c-739ba42a465e\") " Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.728865 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle" (OuterVolumeSpecName: "bundle") pod "2968d005-f404-4387-ac4c-739ba42a465e" (UID: "2968d005-f404-4387-ac4c-739ba42a465e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.736497 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p" (OuterVolumeSpecName: "kube-api-access-n6t5p") pod "2968d005-f404-4387-ac4c-739ba42a465e" (UID: "2968d005-f404-4387-ac4c-739ba42a465e"). InnerVolumeSpecName "kube-api-access-n6t5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.739976 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util" (OuterVolumeSpecName: "util") pod "2968d005-f404-4387-ac4c-739ba42a465e" (UID: "2968d005-f404-4387-ac4c-739ba42a465e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.829737 4882 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.829770 4882 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2968d005-f404-4387-ac4c-739ba42a465e-util\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:30 crc kubenswrapper[4882]: I1002 16:31:30.829782 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6t5p\" (UniqueName: \"kubernetes.io/projected/2968d005-f404-4387-ac4c-739ba42a465e-kube-api-access-n6t5p\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:31 crc kubenswrapper[4882]: I1002 16:31:31.362675 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" event={"ID":"2968d005-f404-4387-ac4c-739ba42a465e","Type":"ContainerDied","Data":"50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017"} Oct 02 16:31:31 crc kubenswrapper[4882]: I1002 16:31:31.362715 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50ffce39c9199a7128ba68a7cea0c5c57949a7949017a34b91bee6ff4eb37017" Oct 02 16:31:31 crc kubenswrapper[4882]: I1002 16:31:31.362805 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.496517 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:31:37 crc kubenswrapper[4882]: E1002 16:31:37.497309 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="pull" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.497325 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="pull" Oct 02 16:31:37 crc kubenswrapper[4882]: E1002 16:31:37.497340 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="extract" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.497347 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="extract" Oct 02 16:31:37 crc kubenswrapper[4882]: E1002 16:31:37.497357 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="util" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.497363 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="util" Oct 02 16:31:37 crc kubenswrapper[4882]: E1002 16:31:37.497377 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.497383 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.497477 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="2968d005-f404-4387-ac4c-739ba42a465e" containerName="extract" Oct 02 16:31:37 crc 
kubenswrapper[4882]: I1002 16:31:37.497490 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5af616c-8948-402c-97b8-3aadd17673d2" containerName="console" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.498435 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.507736 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.620128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.620171 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.620318 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xtcv\" (UniqueName: \"kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.722149 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xtcv\" (UniqueName: \"kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.722234 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.722287 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.722832 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.722867 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.746617 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xtcv\" (UniqueName: \"kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv\") pod \"community-operators-rfcb2\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:37 crc kubenswrapper[4882]: I1002 16:31:37.859981 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:38 crc kubenswrapper[4882]: I1002 16:31:38.336046 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:31:38 crc kubenswrapper[4882]: I1002 16:31:38.419096 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerStarted","Data":"da234aee9786befe711c99afc8a5532e77da0c952581335f1f89088bd2c25e7f"} Oct 02 16:31:39 crc kubenswrapper[4882]: I1002 16:31:39.425359 4882 generic.go:334] "Generic (PLEG): container finished" podID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerID="17f6bab85dd9a04460dcce08857b45a763e3ab3a8e9824e170ce4c5f4540eb3a" exitCode=0 Oct 02 16:31:39 crc kubenswrapper[4882]: I1002 16:31:39.425449 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerDied","Data":"17f6bab85dd9a04460dcce08857b45a763e3ab3a8e9824e170ce4c5f4540eb3a"} Oct 02 16:31:41 crc kubenswrapper[4882]: I1002 16:31:41.438534 4882 generic.go:334] "Generic (PLEG): container finished" podID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerID="92723c67f38f1e15c649e3eb09dfc180b433670f50274eca6f8c0254916fe00f" exitCode=0 Oct 02 16:31:41 crc kubenswrapper[4882]: I1002 16:31:41.438583 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerDied","Data":"92723c67f38f1e15c649e3eb09dfc180b433670f50274eca6f8c0254916fe00f"} Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.083200 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn"] Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.084310 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.087629 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.087889 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hqdsv" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.088063 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.089587 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.090073 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.182378 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-webhook-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.182681 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-apiservice-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.182754 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc22f\" (UniqueName: \"kubernetes.io/projected/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-kube-api-access-kc22f\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.188328 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn"] Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.283587 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc22f\" (UniqueName: \"kubernetes.io/projected/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-kube-api-access-kc22f\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.283686 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-webhook-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.283747 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-apiservice-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.291299 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-webhook-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.291299 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-apiservice-cert\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.330313 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj"] Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.331975 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.335604 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8m2r9" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.335618 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.338165 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.342132 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj"] Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.343211 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc22f\" (UniqueName: \"kubernetes.io/projected/8efade14-68e0-4743-a2f7-e7b9eac6ceb2-kube-api-access-kc22f\") pod \"metallb-operator-controller-manager-84d6bb9698-dxdmn\" (UID: \"8efade14-68e0-4743-a2f7-e7b9eac6ceb2\") " pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.384945 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-webhook-cert\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.385018 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.385125 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gtll\" (UniqueName: \"kubernetes.io/projected/eac510c5-342f-431f-8b27-c6919e7703aa-kube-api-access-8gtll\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.457976 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.489936 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gtll\" (UniqueName: \"kubernetes.io/projected/eac510c5-342f-431f-8b27-c6919e7703aa-kube-api-access-8gtll\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.490029 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-webhook-cert\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.490069 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-apiservice-cert\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.494029 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-webhook-cert\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.498813 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac510c5-342f-431f-8b27-c6919e7703aa-apiservice-cert\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.509852 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gtll\" (UniqueName: \"kubernetes.io/projected/eac510c5-342f-431f-8b27-c6919e7703aa-kube-api-access-8gtll\") pod \"metallb-operator-webhook-server-7d58b8bdc4-n4klj\" (UID: \"eac510c5-342f-431f-8b27-c6919e7703aa\") " pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.673155 4882 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.925540 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj"] Oct 02 16:31:42 crc kubenswrapper[4882]: W1002 16:31:42.929450 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8efade14_68e0_4743_a2f7_e7b9eac6ceb2.slice/crio-65d31c552bd608db1de50c83170937a07fc9747a79f4e4736e2afccac7425a61 WatchSource:0}: Error finding container 65d31c552bd608db1de50c83170937a07fc9747a79f4e4736e2afccac7425a61: Status 404 returned error can't find the container with id 65d31c552bd608db1de50c83170937a07fc9747a79f4e4736e2afccac7425a61 Oct 02 16:31:42 crc kubenswrapper[4882]: I1002 16:31:42.930202 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn"] Oct 02 16:31:43 crc kubenswrapper[4882]: I1002 16:31:43.457618 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" event={"ID":"8efade14-68e0-4743-a2f7-e7b9eac6ceb2","Type":"ContainerStarted","Data":"65d31c552bd608db1de50c83170937a07fc9747a79f4e4736e2afccac7425a61"} Oct 02 16:31:43 crc kubenswrapper[4882]: I1002 16:31:43.458934 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" event={"ID":"eac510c5-342f-431f-8b27-c6919e7703aa","Type":"ContainerStarted","Data":"dc2bd81edcc0d0040ded6ca7d5823ee67a607584aebc4b5442cdfe22e724c4cd"} Oct 02 16:31:44 crc kubenswrapper[4882]: I1002 16:31:44.469957 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerStarted","Data":"53e0cbb11866f92ebc74582d88c396361f95a738327cc8ea1b33dff834c5b3a0"} Oct 02 16:31:47 crc kubenswrapper[4882]: I1002 16:31:47.861073 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:47 crc kubenswrapper[4882]: I1002 16:31:47.862494 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:47 crc kubenswrapper[4882]: I1002 16:31:47.957825 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:47 crc kubenswrapper[4882]: I1002 16:31:47.988777 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rfcb2" podStartSLOduration=6.285939464 podStartE2EDuration="10.988758548s" podCreationTimestamp="2025-10-02 16:31:37 +0000 UTC" firstStartedPulling="2025-10-02 16:31:39.427734243 +0000 UTC m=+858.176963770" lastFinishedPulling="2025-10-02 16:31:44.130553327 +0000 UTC m=+862.879782854" observedRunningTime="2025-10-02 16:31:45.509434791 +0000 UTC m=+864.258664318" watchObservedRunningTime="2025-10-02 16:31:47.988758548 +0000 UTC m=+866.737988075" Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.530603 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" 
event={"ID":"eac510c5-342f-431f-8b27-c6919e7703aa","Type":"ContainerStarted","Data":"cecdec41c7933a04bc7ef43e942f8744141f84731c279dae4b238e6f3aa41202"} Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.531182 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.534726 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" event={"ID":"8efade14-68e0-4743-a2f7-e7b9eac6ceb2","Type":"ContainerStarted","Data":"24eb3d5a0fa3714bf9e6b621053dab3118dda3ba9a298f0347b75fd1831f161d"} Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.534931 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.552721 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" podStartSLOduration=1.9793199179999998 podStartE2EDuration="9.552702991s" podCreationTimestamp="2025-10-02 16:31:42 +0000 UTC" firstStartedPulling="2025-10-02 16:31:42.929918511 +0000 UTC m=+861.679148038" lastFinishedPulling="2025-10-02 16:31:50.503301584 +0000 UTC m=+869.252531111" observedRunningTime="2025-10-02 16:31:51.550713703 +0000 UTC m=+870.299943240" watchObservedRunningTime="2025-10-02 16:31:51.552702991 +0000 UTC m=+870.301932538" Oct 02 16:31:51 crc kubenswrapper[4882]: I1002 16:31:51.580828 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" podStartSLOduration=2.028409595 podStartE2EDuration="9.580809786s" podCreationTimestamp="2025-10-02 16:31:42 +0000 UTC" firstStartedPulling="2025-10-02 16:31:42.933254522 +0000 UTC m=+861.682484059" lastFinishedPulling="2025-10-02 16:31:50.485654723 +0000 UTC m=+869.234884250" observedRunningTime="2025-10-02 16:31:51.57807794 +0000 UTC m=+870.327307467" watchObservedRunningTime="2025-10-02 16:31:51.580809786 +0000 UTC m=+870.330039313" Oct 02 16:31:57 crc kubenswrapper[4882]: I1002 16:31:57.905040 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:57 crc kubenswrapper[4882]: I1002 16:31:57.960142 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:31:58 crc kubenswrapper[4882]: I1002 16:31:58.575354 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rfcb2" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="registry-server" containerID="cri-o://53e0cbb11866f92ebc74582d88c396361f95a738327cc8ea1b33dff834c5b3a0" gracePeriod=2 Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.584954 4882 generic.go:334] "Generic (PLEG): container finished" podID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerID="53e0cbb11866f92ebc74582d88c396361f95a738327cc8ea1b33dff834c5b3a0" exitCode=0 Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.585030 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerDied","Data":"53e0cbb11866f92ebc74582d88c396361f95a738327cc8ea1b33dff834c5b3a0"} Oct 02 16:31:59 crc 
kubenswrapper[4882]: I1002 16:31:59.585254 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfcb2" event={"ID":"088abb37-c02c-4662-ab65-8f4db78d17e4","Type":"ContainerDied","Data":"da234aee9786befe711c99afc8a5532e77da0c952581335f1f89088bd2c25e7f"} Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.585269 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da234aee9786befe711c99afc8a5532e77da0c952581335f1f89088bd2c25e7f" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.586896 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.641898 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities\") pod \"088abb37-c02c-4662-ab65-8f4db78d17e4\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.642049 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xtcv\" (UniqueName: \"kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv\") pod \"088abb37-c02c-4662-ab65-8f4db78d17e4\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.642106 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content\") pod \"088abb37-c02c-4662-ab65-8f4db78d17e4\" (UID: \"088abb37-c02c-4662-ab65-8f4db78d17e4\") " Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.642816 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities" (OuterVolumeSpecName: "utilities") pod "088abb37-c02c-4662-ab65-8f4db78d17e4" (UID: "088abb37-c02c-4662-ab65-8f4db78d17e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.675440 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv" (OuterVolumeSpecName: "kube-api-access-5xtcv") pod "088abb37-c02c-4662-ab65-8f4db78d17e4" (UID: "088abb37-c02c-4662-ab65-8f4db78d17e4"). InnerVolumeSpecName "kube-api-access-5xtcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.701286 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "088abb37-c02c-4662-ab65-8f4db78d17e4" (UID: "088abb37-c02c-4662-ab65-8f4db78d17e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.743662 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.743711 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xtcv\" (UniqueName: \"kubernetes.io/projected/088abb37-c02c-4662-ab65-8f4db78d17e4-kube-api-access-5xtcv\") on node \"crc\" DevicePath \"\"" Oct 02 16:31:59 crc kubenswrapper[4882]: I1002 16:31:59.743726 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088abb37-c02c-4662-ab65-8f4db78d17e4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:32:00 crc kubenswrapper[4882]: I1002 16:32:00.591663 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfcb2" Oct 02 16:32:00 crc kubenswrapper[4882]: I1002 16:32:00.639379 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:32:00 crc kubenswrapper[4882]: I1002 16:32:00.642111 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rfcb2"] Oct 02 16:32:00 crc kubenswrapper[4882]: I1002 16:32:00.768295 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" path="/var/lib/kubelet/pods/088abb37-c02c-4662-ab65-8f4db78d17e4/volumes" Oct 02 16:32:02 crc kubenswrapper[4882]: I1002 16:32:02.679006 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d58b8bdc4-n4klj" Oct 02 16:32:22 crc kubenswrapper[4882]: I1002 16:32:22.461266 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84d6bb9698-dxdmn" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.323588 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zfhmf"] Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.324421 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="extract-content" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.324438 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="extract-content" Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.324463 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="registry-server" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.324471 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="registry-server" Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.324493 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="extract-utilities" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.324502 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="extract-utilities" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.324632 4882 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="088abb37-c02c-4662-ab65-8f4db78d17e4" containerName="registry-server" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.327345 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.328679 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6"] Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.329427 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.330391 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.332098 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2vpnx" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.332110 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.332111 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.340266 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6"] Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.417452 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xrh84"] Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.418669 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.425391 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.425406 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.428876 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.429524 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-b6vpv" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.447712 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-9dk8h"] Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.452486 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.464788 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-sockets\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.464876 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.464890 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-conf\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465299 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa396476-fa37-4443-af9a-1cf39f701d65-metrics-certs\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465333 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aa396476-fa37-4443-af9a-1cf39f701d65-frr-startup\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465422 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4fl\" (UniqueName: \"kubernetes.io/projected/aa396476-fa37-4443-af9a-1cf39f701d65-kube-api-access-8l4fl\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465496 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75484683-14ac-4898-becf-2faf5970437d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465520 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbp9j\" (UniqueName: \"kubernetes.io/projected/75484683-14ac-4898-becf-2faf5970437d-kube-api-access-jbp9j\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465567 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-reloader\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.465590 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-metrics\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.512706 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-9dk8h"] Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567557 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-reloader\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567624 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-metrics\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567671 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-sockets\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567723 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metallb-excludel2\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567755 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metrics-certs\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567780 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-conf\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567801 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-cert\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567836 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa396476-fa37-4443-af9a-1cf39f701d65-metrics-certs\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567851 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist\") 
pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567868 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aa396476-fa37-4443-af9a-1cf39f701d65-frr-startup\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567907 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4fl\" (UniqueName: \"kubernetes.io/projected/aa396476-fa37-4443-af9a-1cf39f701d65-kube-api-access-8l4fl\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.567936 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.568155 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxk5c\" (UniqueName: \"kubernetes.io/projected/3495a5a2-8e62-47ee-9f17-478981d8fa27-kube-api-access-vxk5c\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.568192 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75484683-14ac-4898-becf-2faf5970437d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.568242 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbp9j\" (UniqueName: \"kubernetes.io/projected/75484683-14ac-4898-becf-2faf5970437d-kube-api-access-jbp9j\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.568275 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2d6b\" (UniqueName: \"kubernetes.io/projected/bb683c00-49a1-49b9-8ab5-5d519f4ad310-kube-api-access-f2d6b\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.568793 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-reloader\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.569195 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-metrics\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " 
pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.569515 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-sockets\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.569821 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aa396476-fa37-4443-af9a-1cf39f701d65-frr-conf\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.570141 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aa396476-fa37-4443-af9a-1cf39f701d65-frr-startup\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.580969 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa396476-fa37-4443-af9a-1cf39f701d65-metrics-certs\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.581003 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75484683-14ac-4898-becf-2faf5970437d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.594576 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4fl\" (UniqueName: \"kubernetes.io/projected/aa396476-fa37-4443-af9a-1cf39f701d65-kube-api-access-8l4fl\") pod \"frr-k8s-zfhmf\" (UID: \"aa396476-fa37-4443-af9a-1cf39f701d65\") " pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.597871 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbp9j\" (UniqueName: \"kubernetes.io/projected/75484683-14ac-4898-becf-2faf5970437d-kube-api-access-jbp9j\") pod \"frr-k8s-webhook-server-64bf5d555-z72g6\" (UID: \"75484683-14ac-4898-becf-2faf5970437d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.647375 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.661054 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.669875 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metallb-excludel2\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.669930 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metrics-certs\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.669958 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-cert\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.670002 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.670042 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.670064 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxk5c\" (UniqueName: \"kubernetes.io/projected/3495a5a2-8e62-47ee-9f17-478981d8fa27-kube-api-access-vxk5c\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.670102 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2d6b\" (UniqueName: \"kubernetes.io/projected/bb683c00-49a1-49b9-8ab5-5d519f4ad310-kube-api-access-f2d6b\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.671134 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metallb-excludel2\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.671239 4882 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.671287 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist podName:bb683c00-49a1-49b9-8ab5-5d519f4ad310 nodeName:}" failed. 
No retries permitted until 2025-10-02 16:32:24.171272413 +0000 UTC m=+902.920501940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist") pod "speaker-xrh84" (UID: "bb683c00-49a1-49b9-8ab5-5d519f4ad310") : secret "metallb-memberlist" not found Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.671576 4882 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 02 16:32:23 crc kubenswrapper[4882]: E1002 16:32:23.671688 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs podName:3495a5a2-8e62-47ee-9f17-478981d8fa27 nodeName:}" failed. No retries permitted until 2025-10-02 16:32:24.171652352 +0000 UTC m=+902.920881879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs") pod "controller-68d546b9d8-9dk8h" (UID: "3495a5a2-8e62-47ee-9f17-478981d8fa27") : secret "controller-certs-secret" not found Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.674363 4882 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.675339 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-metrics-certs\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.688825 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-cert\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.693129 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2d6b\" (UniqueName: \"kubernetes.io/projected/bb683c00-49a1-49b9-8ab5-5d519f4ad310-kube-api-access-f2d6b\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:23 crc kubenswrapper[4882]: I1002 16:32:23.698980 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxk5c\" (UniqueName: \"kubernetes.io/projected/3495a5a2-8e62-47ee-9f17-478981d8fa27-kube-api-access-vxk5c\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.084061 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6"] Oct 02 16:32:24 crc kubenswrapper[4882]: W1002 16:32:24.092340 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75484683_14ac_4898_becf_2faf5970437d.slice/crio-761240eaacfb97c01fbbe00d2f137ab51063dd682d3c8a37e571c08f89afeb2e WatchSource:0}: Error finding container 761240eaacfb97c01fbbe00d2f137ab51063dd682d3c8a37e571c08f89afeb2e: Status 404 returned error can't find the container with id 
761240eaacfb97c01fbbe00d2f137ab51063dd682d3c8a37e571c08f89afeb2e Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.177523 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:24 crc kubenswrapper[4882]: E1002 16:32:24.177722 4882 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.177910 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:24 crc kubenswrapper[4882]: E1002 16:32:24.177967 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist podName:bb683c00-49a1-49b9-8ab5-5d519f4ad310 nodeName:}" failed. No retries permitted until 2025-10-02 16:32:25.177948052 +0000 UTC m=+903.927177579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist") pod "speaker-xrh84" (UID: "bb683c00-49a1-49b9-8ab5-5d519f4ad310") : secret "metallb-memberlist" not found Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.184306 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3495a5a2-8e62-47ee-9f17-478981d8fa27-metrics-certs\") pod \"controller-68d546b9d8-9dk8h\" (UID: \"3495a5a2-8e62-47ee-9f17-478981d8fa27\") " pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.380342 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.588346 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-9dk8h"] Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.730782 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"8da6f6a1a4314ae983e39aed4d8ba9932f8cd23bda8f5d330fe8efc86843c207"} Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.733585 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" event={"ID":"75484683-14ac-4898-becf-2faf5970437d","Type":"ContainerStarted","Data":"761240eaacfb97c01fbbe00d2f137ab51063dd682d3c8a37e571c08f89afeb2e"} Oct 02 16:32:24 crc kubenswrapper[4882]: I1002 16:32:24.735386 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9dk8h" event={"ID":"3495a5a2-8e62-47ee-9f17-478981d8fa27","Type":"ContainerStarted","Data":"48286d07bf1ff499da42f38ba4c9ed88bea20a6d46dbd5c2de35d9477b147e6f"} Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.191570 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.196691 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bb683c00-49a1-49b9-8ab5-5d519f4ad310-memberlist\") pod \"speaker-xrh84\" (UID: \"bb683c00-49a1-49b9-8ab5-5d519f4ad310\") " pod="metallb-system/speaker-xrh84" Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.232575 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-xrh84" Oct 02 16:32:25 crc kubenswrapper[4882]: W1002 16:32:25.253636 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb683c00_49a1_49b9_8ab5_5d519f4ad310.slice/crio-2a50bf251e28beae23760d73e32a747632695f9750930ad6f06fe628e5bd708b WatchSource:0}: Error finding container 2a50bf251e28beae23760d73e32a747632695f9750930ad6f06fe628e5bd708b: Status 404 returned error can't find the container with id 2a50bf251e28beae23760d73e32a747632695f9750930ad6f06fe628e5bd708b Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.745359 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9dk8h" event={"ID":"3495a5a2-8e62-47ee-9f17-478981d8fa27","Type":"ContainerStarted","Data":"dffd8b6e38bd834cfad749327f5d8229815a0dfe13426a44a61b0e7534e2b958"} Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.745650 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-9dk8h" event={"ID":"3495a5a2-8e62-47ee-9f17-478981d8fa27","Type":"ContainerStarted","Data":"fcb5bf705b2f49d14130d0f7449cd546fef35d2c0bc54f6b823bae024ba21abf"} Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.746448 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.747883 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xrh84" event={"ID":"bb683c00-49a1-49b9-8ab5-5d519f4ad310","Type":"ContainerStarted","Data":"0e4518060b08030803bf05d767f814fed7d203a7c626ae770438a0c648acc0ee"} Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.747903 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xrh84" event={"ID":"bb683c00-49a1-49b9-8ab5-5d519f4ad310","Type":"ContainerStarted","Data":"2a50bf251e28beae23760d73e32a747632695f9750930ad6f06fe628e5bd708b"} Oct 02 16:32:25 crc kubenswrapper[4882]: I1002 16:32:25.779353 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-9dk8h" podStartSLOduration=2.779336264 podStartE2EDuration="2.779336264s" podCreationTimestamp="2025-10-02 16:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:32:25.7787609 +0000 UTC m=+904.527990447" watchObservedRunningTime="2025-10-02 16:32:25.779336264 +0000 UTC m=+904.528565791" Oct 02 16:32:26 crc kubenswrapper[4882]: I1002 16:32:26.758446 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xrh84" event={"ID":"bb683c00-49a1-49b9-8ab5-5d519f4ad310","Type":"ContainerStarted","Data":"9672381eceea265c12dceeabf916d059655f98589f11dea5c42c131d49084719"} Oct 02 16:32:26 crc kubenswrapper[4882]: I1002 16:32:26.797955 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xrh84" podStartSLOduration=3.797937021 podStartE2EDuration="3.797937021s" podCreationTimestamp="2025-10-02 16:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:32:26.793086702 +0000 UTC m=+905.542316239" watchObservedRunningTime="2025-10-02 16:32:26.797937021 +0000 UTC m=+905.547166548" Oct 02 16:32:27 crc kubenswrapper[4882]: I1002 16:32:27.764548 4882 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-xrh84" Oct 02 16:32:31 crc kubenswrapper[4882]: I1002 16:32:31.792924 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" event={"ID":"75484683-14ac-4898-becf-2faf5970437d","Type":"ContainerStarted","Data":"f79de87968f39e9ef29514e3b7162b968b8465ec67138bd72159859d76e12cb9"} Oct 02 16:32:31 crc kubenswrapper[4882]: I1002 16:32:31.793756 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:31 crc kubenswrapper[4882]: I1002 16:32:31.798148 4882 generic.go:334] "Generic (PLEG): container finished" podID="aa396476-fa37-4443-af9a-1cf39f701d65" containerID="52a0a134144eae2e77ff21493f4eff286393eeaed35db9f73c4a4be68278ab1c" exitCode=0 Oct 02 16:32:31 crc kubenswrapper[4882]: I1002 16:32:31.798194 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerDied","Data":"52a0a134144eae2e77ff21493f4eff286393eeaed35db9f73c4a4be68278ab1c"} Oct 02 16:32:31 crc kubenswrapper[4882]: I1002 16:32:31.813600 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" podStartSLOduration=1.722048225 podStartE2EDuration="8.813581483s" podCreationTimestamp="2025-10-02 16:32:23 +0000 UTC" firstStartedPulling="2025-10-02 16:32:24.095631715 +0000 UTC m=+902.844861252" lastFinishedPulling="2025-10-02 16:32:31.187164983 +0000 UTC m=+909.936394510" observedRunningTime="2025-10-02 16:32:31.809206046 +0000 UTC m=+910.558435593" watchObservedRunningTime="2025-10-02 16:32:31.813581483 +0000 UTC m=+910.562811010" Oct 02 16:32:32 crc kubenswrapper[4882]: I1002 16:32:32.806127 4882 generic.go:334] "Generic (PLEG): container finished" podID="aa396476-fa37-4443-af9a-1cf39f701d65" containerID="c9f289c98195190cceb447a2aff5582d456c857690d0e784548760e420216b58" exitCode=0 Oct 02 16:32:32 crc kubenswrapper[4882]: I1002 16:32:32.806260 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerDied","Data":"c9f289c98195190cceb447a2aff5582d456c857690d0e784548760e420216b58"} Oct 02 16:32:33 crc kubenswrapper[4882]: I1002 16:32:33.814333 4882 generic.go:334] "Generic (PLEG): container finished" podID="aa396476-fa37-4443-af9a-1cf39f701d65" containerID="c61d4c54242f6576373d40bda1dd1d9fe97f2ab3c35e54347eec8d50766bd41c" exitCode=0 Oct 02 16:32:33 crc kubenswrapper[4882]: I1002 16:32:33.814428 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerDied","Data":"c61d4c54242f6576373d40bda1dd1d9fe97f2ab3c35e54347eec8d50766bd41c"} Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.384114 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-9dk8h" Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.825466 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"e3a1a413e40cb7037c284ff95fdcb8954ccc5104eedb36d4b4d09f553bd5d079"} Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.825531 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"5b577c6c77ddf9dd9f45132c1d1bee6a59dd8e4063904a15e971fe7c78009978"} Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.825541 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"07e85bbf555e322ea785adaa7129f3e17f6039d5fa468960b55d030973cda93a"} Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.825553 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"c25c31c683dcdff82f44d1566809ea6caa8ca50597f43565a80df05e77abb9fd"} Oct 02 16:32:34 crc kubenswrapper[4882]: I1002 16:32:34.825564 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"69afebc94a74664740a596609531236674baa4ae266e0834dfe4d5e1c089f022"} Oct 02 16:32:35 crc kubenswrapper[4882]: I1002 16:32:35.236848 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xrh84" Oct 02 16:32:35 crc kubenswrapper[4882]: I1002 16:32:35.836658 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zfhmf" event={"ID":"aa396476-fa37-4443-af9a-1cf39f701d65","Type":"ContainerStarted","Data":"1341017d225b914eb195b93c77173e60da5d80d227f3794b80e23d1291cf5fd0"} Oct 02 16:32:35 crc kubenswrapper[4882]: I1002 16:32:35.836850 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:35 crc kubenswrapper[4882]: I1002 16:32:35.868677 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zfhmf" podStartSLOduration=5.8492875269999995 podStartE2EDuration="12.868655535s" podCreationTimestamp="2025-10-02 16:32:23 +0000 UTC" firstStartedPulling="2025-10-02 16:32:24.177980413 +0000 UTC m=+902.927209940" lastFinishedPulling="2025-10-02 16:32:31.197348411 +0000 UTC m=+909.946577948" observedRunningTime="2025-10-02 16:32:35.866669127 +0000 UTC m=+914.615898654" watchObservedRunningTime="2025-10-02 16:32:35.868655535 +0000 UTC m=+914.617885072" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.114046 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm"] Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.115608 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.117604 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.126703 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm"] Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.265128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6bl\" (UniqueName: \"kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.265561 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.265585 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.366580 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6bl\" (UniqueName: \"kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.366718 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.366745 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.367189 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.367277 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.398476 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6bl\" (UniqueName: \"kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.432987 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.628287 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm"] Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.850303 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerStarted","Data":"45eab478df12fb9453e41dfefe9da8068d6bd9ced978cb24eaf403fb8fb19c5d"} Oct 02 16:32:37 crc kubenswrapper[4882]: I1002 16:32:37.850666 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerStarted","Data":"5bca011acc39f71bf517bb016f60ed27af2aa2b7ad38c625d29ac7ffe8e58d0f"} Oct 02 16:32:38 crc kubenswrapper[4882]: I1002 16:32:38.647977 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:38 crc kubenswrapper[4882]: I1002 16:32:38.706750 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:38 crc kubenswrapper[4882]: I1002 16:32:38.857455 4882 generic.go:334] "Generic (PLEG): container finished" podID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerID="45eab478df12fb9453e41dfefe9da8068d6bd9ced978cb24eaf403fb8fb19c5d" exitCode=0 Oct 02 16:32:38 crc kubenswrapper[4882]: I1002 16:32:38.857598 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerDied","Data":"45eab478df12fb9453e41dfefe9da8068d6bd9ced978cb24eaf403fb8fb19c5d"} Oct 02 16:32:41 crc kubenswrapper[4882]: I1002 16:32:41.884812 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" 
event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerStarted","Data":"0c89c79cec948f9ba436b42773f95af9674d8addb97c3243ed7a7c67edd0eb37"} Oct 02 16:32:42 crc kubenswrapper[4882]: I1002 16:32:42.891237 4882 generic.go:334] "Generic (PLEG): container finished" podID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerID="0c89c79cec948f9ba436b42773f95af9674d8addb97c3243ed7a7c67edd0eb37" exitCode=0 Oct 02 16:32:42 crc kubenswrapper[4882]: I1002 16:32:42.891254 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerDied","Data":"0c89c79cec948f9ba436b42773f95af9674d8addb97c3243ed7a7c67edd0eb37"} Oct 02 16:32:43 crc kubenswrapper[4882]: I1002 16:32:43.653384 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zfhmf" Oct 02 16:32:43 crc kubenswrapper[4882]: I1002 16:32:43.672767 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-z72g6" Oct 02 16:32:43 crc kubenswrapper[4882]: I1002 16:32:43.903164 4882 generic.go:334] "Generic (PLEG): container finished" podID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerID="f9e115d656940700a814ce70968ba10c6b75fa617bfc4667ec6d6b015fd1c4dc" exitCode=0 Oct 02 16:32:43 crc kubenswrapper[4882]: I1002 16:32:43.903206 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerDied","Data":"f9e115d656940700a814ce70968ba10c6b75fa617bfc4667ec6d6b015fd1c4dc"} Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.164470 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.195136 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle\") pod \"3dbaddb4-a180-45f5-af37-81327245a4dd\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.195191 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util\") pod \"3dbaddb4-a180-45f5-af37-81327245a4dd\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.195276 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p6bl\" (UniqueName: \"kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl\") pod \"3dbaddb4-a180-45f5-af37-81327245a4dd\" (UID: \"3dbaddb4-a180-45f5-af37-81327245a4dd\") " Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.196451 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle" (OuterVolumeSpecName: "bundle") pod "3dbaddb4-a180-45f5-af37-81327245a4dd" (UID: "3dbaddb4-a180-45f5-af37-81327245a4dd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.203523 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl" (OuterVolumeSpecName: "kube-api-access-8p6bl") pod "3dbaddb4-a180-45f5-af37-81327245a4dd" (UID: "3dbaddb4-a180-45f5-af37-81327245a4dd"). InnerVolumeSpecName "kube-api-access-8p6bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.206779 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util" (OuterVolumeSpecName: "util") pod "3dbaddb4-a180-45f5-af37-81327245a4dd" (UID: "3dbaddb4-a180-45f5-af37-81327245a4dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.297315 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6bl\" (UniqueName: \"kubernetes.io/projected/3dbaddb4-a180-45f5-af37-81327245a4dd-kube-api-access-8p6bl\") on node \"crc\" DevicePath \"\"" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.297365 4882 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.297376 4882 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dbaddb4-a180-45f5-af37-81327245a4dd-util\") on node \"crc\" DevicePath \"\"" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.916971 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" event={"ID":"3dbaddb4-a180-45f5-af37-81327245a4dd","Type":"ContainerDied","Data":"5bca011acc39f71bf517bb016f60ed27af2aa2b7ad38c625d29ac7ffe8e58d0f"} Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.917283 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm" Oct 02 16:32:45 crc kubenswrapper[4882]: I1002 16:32:45.917294 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bca011acc39f71bf517bb016f60ed27af2aa2b7ad38c625d29ac7ffe8e58d0f" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.942800 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls"] Oct 02 16:32:49 crc kubenswrapper[4882]: E1002 16:32:49.943417 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="extract" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.943431 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="extract" Oct 02 16:32:49 crc kubenswrapper[4882]: E1002 16:32:49.943450 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="pull" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.943456 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="pull" Oct 02 16:32:49 crc kubenswrapper[4882]: E1002 16:32:49.943466 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="util" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.943474 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="util" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.943592 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbaddb4-a180-45f5-af37-81327245a4dd" containerName="extract" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.944006 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.948513 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.949665 4882 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-zgqh2" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.950071 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 02 16:32:49 crc kubenswrapper[4882]: I1002 16:32:49.975625 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls"] Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.062003 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/d93f70cf-2d5c-48dd-ac12-555bb190d000-kube-api-access-4jznh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2qqls\" (UID: \"d93f70cf-2d5c-48dd-ac12-555bb190d000\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.163765 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/d93f70cf-2d5c-48dd-ac12-555bb190d000-kube-api-access-4jznh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2qqls\" (UID: \"d93f70cf-2d5c-48dd-ac12-555bb190d000\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.183572 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jznh\" (UniqueName: \"kubernetes.io/projected/d93f70cf-2d5c-48dd-ac12-555bb190d000-kube-api-access-4jznh\") pod \"cert-manager-operator-controller-manager-57cd46d6d-2qqls\" (UID: \"d93f70cf-2d5c-48dd-ac12-555bb190d000\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.279528 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.510867 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls"] Oct 02 16:32:50 crc kubenswrapper[4882]: W1002 16:32:50.526355 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93f70cf_2d5c_48dd_ac12_555bb190d000.slice/crio-e838fd0f7a5ef54e4b97e6652ce49398713c4f40dbb2b6bbceae6f594a810486 WatchSource:0}: Error finding container e838fd0f7a5ef54e4b97e6652ce49398713c4f40dbb2b6bbceae6f594a810486: Status 404 returned error can't find the container with id e838fd0f7a5ef54e4b97e6652ce49398713c4f40dbb2b6bbceae6f594a810486 Oct 02 16:32:50 crc kubenswrapper[4882]: I1002 16:32:50.947068 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" event={"ID":"d93f70cf-2d5c-48dd-ac12-555bb190d000","Type":"ContainerStarted","Data":"e838fd0f7a5ef54e4b97e6652ce49398713c4f40dbb2b6bbceae6f594a810486"} Oct 02 16:32:58 crc kubenswrapper[4882]: I1002 16:32:58.006088 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" event={"ID":"d93f70cf-2d5c-48dd-ac12-555bb190d000","Type":"ContainerStarted","Data":"62c7b96b2b9eb4e15db8c73740aba30048862e0d32541f3f3f7e7ce72c07a74f"} Oct 02 16:32:58 crc kubenswrapper[4882]: I1002 16:32:58.027702 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-2qqls" podStartSLOduration=2.513973701 podStartE2EDuration="9.027683717s" podCreationTimestamp="2025-10-02 16:32:49 +0000 UTC" firstStartedPulling="2025-10-02 16:32:50.531385884 +0000 UTC m=+929.280615411" lastFinishedPulling="2025-10-02 16:32:57.0450959 +0000 UTC m=+935.794325427" observedRunningTime="2025-10-02 16:32:58.024882139 +0000 UTC m=+936.774111666" watchObservedRunningTime="2025-10-02 16:32:58.027683717 +0000 UTC m=+936.776913244" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.428525 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nntcr"] Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.430080 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.432743 4882 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rrn58" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.433044 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.437241 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.488987 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nntcr"] Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.504578 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.504648 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264mc\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-kube-api-access-264mc\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.607000 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.607076 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-264mc\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-kube-api-access-264mc\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.629134 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.635254 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-264mc\" (UniqueName: \"kubernetes.io/projected/1289b43b-8f30-4ec0-bfe8-e42890a1ebfb-kube-api-access-264mc\") pod \"cert-manager-webhook-d969966f-nntcr\" (UID: \"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb\") " pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:00 crc kubenswrapper[4882]: I1002 16:33:00.755144 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:01 crc kubenswrapper[4882]: I1002 16:33:01.266157 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-nntcr"] Oct 02 16:33:02 crc kubenswrapper[4882]: I1002 16:33:02.036562 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" event={"ID":"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb","Type":"ContainerStarted","Data":"8b808859a7aa7e039db06a5a293446c2edc61852a4dc455b7765e9b3668e0a6a"} Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.120584 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv"] Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.121739 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.126001 4882 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-m2pzq" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.147878 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv"] Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.190787 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.190842 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpx9\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-kube-api-access-7rpx9\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.292525 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.292620 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpx9\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-kube-api-access-7rpx9\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.329650 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.333294 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7rpx9\" (UniqueName: \"kubernetes.io/projected/677982a9-5a39-436c-abca-e7a8d0ac0a6f-kube-api-access-7rpx9\") pod \"cert-manager-cainjector-7d9f95dbf-lb8mv\" (UID: \"677982a9-5a39-436c-abca-e7a8d0ac0a6f\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.447197 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" Oct 02 16:33:04 crc kubenswrapper[4882]: I1002 16:33:04.725419 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv"] Oct 02 16:33:05 crc kubenswrapper[4882]: I1002 16:33:05.065609 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" event={"ID":"677982a9-5a39-436c-abca-e7a8d0ac0a6f","Type":"ContainerStarted","Data":"3a1d3865417eb3ace481844cac53b6dc1dd4612fb8a74145d9607016fc7be787"} Oct 02 16:33:08 crc kubenswrapper[4882]: I1002 16:33:08.100637 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" event={"ID":"677982a9-5a39-436c-abca-e7a8d0ac0a6f","Type":"ContainerStarted","Data":"f2eeb9d7b4ed822bf9442922ffdfeb981685f8fa02d8c7f8993eced371ef856d"} Oct 02 16:33:08 crc kubenswrapper[4882]: I1002 16:33:08.104791 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" event={"ID":"1289b43b-8f30-4ec0-bfe8-e42890a1ebfb","Type":"ContainerStarted","Data":"9faf02769402133ff662a97d6aae8de534b7e260ade500db51e1bb3beeda1896"} Oct 02 16:33:08 crc kubenswrapper[4882]: I1002 16:33:08.105596 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:08 crc kubenswrapper[4882]: I1002 16:33:08.119814 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lb8mv" podStartSLOduration=1.71543311 podStartE2EDuration="4.119794369s" podCreationTimestamp="2025-10-02 16:33:04 +0000 UTC" firstStartedPulling="2025-10-02 16:33:04.729051571 +0000 UTC m=+943.478281098" lastFinishedPulling="2025-10-02 16:33:07.13341283 +0000 UTC m=+945.882642357" observedRunningTime="2025-10-02 16:33:08.117781161 +0000 UTC m=+946.867010678" watchObservedRunningTime="2025-10-02 16:33:08.119794369 +0000 UTC m=+946.869023896" Oct 02 16:33:09 crc kubenswrapper[4882]: I1002 16:33:09.390745 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:33:09 crc kubenswrapper[4882]: I1002 16:33:09.390840 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.111038 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" podStartSLOduration=5.254921828 podStartE2EDuration="11.111018463s" podCreationTimestamp="2025-10-02 16:33:00 
+0000 UTC" firstStartedPulling="2025-10-02 16:33:01.280502762 +0000 UTC m=+940.029732289" lastFinishedPulling="2025-10-02 16:33:07.136599397 +0000 UTC m=+945.885828924" observedRunningTime="2025-10-02 16:33:08.144644816 +0000 UTC m=+946.893874343" watchObservedRunningTime="2025-10-02 16:33:11.111018463 +0000 UTC m=+949.860247990" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.112444 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-cws7m"] Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.113366 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.117232 4882 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zxtf5" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.129414 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-cws7m"] Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.209433 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8v2\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-kube-api-access-dz8v2\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.209564 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.310654 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.310803 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8v2\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-kube-api-access-dz8v2\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.338022 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.338728 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8v2\" (UniqueName: \"kubernetes.io/projected/d90adcf1-b8ee-4f9b-a722-430f2e782f1d-kube-api-access-dz8v2\") pod \"cert-manager-7d4cc89fcb-cws7m\" (UID: \"d90adcf1-b8ee-4f9b-a722-430f2e782f1d\") " pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.492958 4882 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" Oct 02 16:33:11 crc kubenswrapper[4882]: I1002 16:33:11.756619 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-cws7m"] Oct 02 16:33:12 crc kubenswrapper[4882]: I1002 16:33:12.133462 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" event={"ID":"d90adcf1-b8ee-4f9b-a722-430f2e782f1d","Type":"ContainerStarted","Data":"af7693bcdb3dcd18c744289bdc4823371ab0ac880beb12eb5457d995a7e2d02f"} Oct 02 16:33:12 crc kubenswrapper[4882]: I1002 16:33:12.133543 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" event={"ID":"d90adcf1-b8ee-4f9b-a722-430f2e782f1d","Type":"ContainerStarted","Data":"391212aa8b76ed6d9423530ba2935561bb48a25242d19fd7e0516ee8699c30e4"} Oct 02 16:33:12 crc kubenswrapper[4882]: I1002 16:33:12.153447 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-cws7m" podStartSLOduration=1.153417469 podStartE2EDuration="1.153417469s" podCreationTimestamp="2025-10-02 16:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:33:12.151176395 +0000 UTC m=+950.900405922" watchObservedRunningTime="2025-10-02 16:33:12.153417469 +0000 UTC m=+950.902646996" Oct 02 16:33:15 crc kubenswrapper[4882]: I1002 16:33:15.759426 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-nntcr" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.598571 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"] Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.600378 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hzcwz" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.603068 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.603754 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6skpn" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.603887 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.650035 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"] Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.737416 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nshmq\" (UniqueName: \"kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq\") pod \"openstack-operator-index-hzcwz\" (UID: \"74179993-8b72-4753-b104-008f95bb3fb0\") " pod="openstack-operators/openstack-operator-index-hzcwz" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.838608 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nshmq\" (UniqueName: \"kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq\") pod \"openstack-operator-index-hzcwz\" (UID: \"74179993-8b72-4753-b104-008f95bb3fb0\") " pod="openstack-operators/openstack-operator-index-hzcwz" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.866783 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nshmq\" (UniqueName: \"kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq\") pod \"openstack-operator-index-hzcwz\" (UID: \"74179993-8b72-4753-b104-008f95bb3fb0\") " pod="openstack-operators/openstack-operator-index-hzcwz" Oct 02 16:33:19 crc kubenswrapper[4882]: I1002 16:33:19.944908 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hzcwz" Oct 02 16:33:20 crc kubenswrapper[4882]: I1002 16:33:20.387729 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"] Oct 02 16:33:20 crc kubenswrapper[4882]: W1002 16:33:20.396666 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74179993_8b72_4753_b104_008f95bb3fb0.slice/crio-177456491d4bec2bddb55f5e3e9587313f065a375f68a20012703f97c004e984 WatchSource:0}: Error finding container 177456491d4bec2bddb55f5e3e9587313f065a375f68a20012703f97c004e984: Status 404 returned error can't find the container with id 177456491d4bec2bddb55f5e3e9587313f065a375f68a20012703f97c004e984 Oct 02 16:33:21 crc kubenswrapper[4882]: I1002 16:33:21.194756 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzcwz" event={"ID":"74179993-8b72-4753-b104-008f95bb3fb0","Type":"ContainerStarted","Data":"177456491d4bec2bddb55f5e3e9587313f065a375f68a20012703f97c004e984"} Oct 02 16:33:22 crc kubenswrapper[4882]: I1002 16:33:22.204295 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzcwz" event={"ID":"74179993-8b72-4753-b104-008f95bb3fb0","Type":"ContainerStarted","Data":"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"} Oct 02 16:33:22 crc kubenswrapper[4882]: I1002 16:33:22.224438 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hzcwz" podStartSLOduration=1.6962020070000001 podStartE2EDuration="3.224394623s" podCreationTimestamp="2025-10-02 16:33:19 +0000 UTC" firstStartedPulling="2025-10-02 16:33:20.403146209 +0000 UTC m=+959.152375736" lastFinishedPulling="2025-10-02 16:33:21.931338825 +0000 UTC m=+960.680568352" observedRunningTime="2025-10-02 16:33:22.219370121 +0000 UTC m=+960.968599648" watchObservedRunningTime="2025-10-02 16:33:22.224394623 +0000 UTC m=+960.973624150" Oct 02 16:33:22 crc kubenswrapper[4882]: I1002 16:33:22.757068 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"] Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.366059 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rlckz"] Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.367598 4882 util.go:30] "No sandbox for pod can be found. 
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.376740 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rlckz"]
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.502214 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqv24\" (UniqueName: \"kubernetes.io/projected/a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3-kube-api-access-kqv24\") pod \"openstack-operator-index-rlckz\" (UID: \"a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3\") " pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.604548 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqv24\" (UniqueName: \"kubernetes.io/projected/a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3-kube-api-access-kqv24\") pod \"openstack-operator-index-rlckz\" (UID: \"a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3\") " pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.633141 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqv24\" (UniqueName: \"kubernetes.io/projected/a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3-kube-api-access-kqv24\") pod \"openstack-operator-index-rlckz\" (UID: \"a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3\") " pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.694158 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:23 crc kubenswrapper[4882]: I1002 16:33:23.912456 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rlckz"]
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.220156 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rlckz" event={"ID":"a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3","Type":"ContainerStarted","Data":"5c429143995d4c08685ccdaf5bc769ea70b0145dc6656efda3363f90c9d6b50f"}
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.220376 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hzcwz" podUID="74179993-8b72-4753-b104-008f95bb3fb0" containerName="registry-server" containerID="cri-o://ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce" gracePeriod=2
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.555516 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hzcwz"
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.720329 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nshmq\" (UniqueName: \"kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq\") pod \"74179993-8b72-4753-b104-008f95bb3fb0\" (UID: \"74179993-8b72-4753-b104-008f95bb3fb0\") "
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.728497 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq" (OuterVolumeSpecName: "kube-api-access-nshmq") pod "74179993-8b72-4753-b104-008f95bb3fb0" (UID: "74179993-8b72-4753-b104-008f95bb3fb0"). InnerVolumeSpecName "kube-api-access-nshmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:33:24 crc kubenswrapper[4882]: I1002 16:33:24.822026 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nshmq\" (UniqueName: \"kubernetes.io/projected/74179993-8b72-4753-b104-008f95bb3fb0-kube-api-access-nshmq\") on node \"crc\" DevicePath \"\""
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.228235 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rlckz" event={"ID":"a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3","Type":"ContainerStarted","Data":"eb7550fb11fab2af0f2c850867bc23871bea419d37a2978b19684a18cd4d7f8b"}
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.229539 4882 generic.go:334] "Generic (PLEG): container finished" podID="74179993-8b72-4753-b104-008f95bb3fb0" containerID="ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce" exitCode=0
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.229593 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzcwz" event={"ID":"74179993-8b72-4753-b104-008f95bb3fb0","Type":"ContainerDied","Data":"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"}
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.229622 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hzcwz" event={"ID":"74179993-8b72-4753-b104-008f95bb3fb0","Type":"ContainerDied","Data":"177456491d4bec2bddb55f5e3e9587313f065a375f68a20012703f97c004e984"}
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.229645 4882 scope.go:117] "RemoveContainer" containerID="ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.229758 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hzcwz"
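The hzcwz teardown above shows the graceful path: the kubelet asks the runtime to kill registry-server with gracePeriod=2, and the container exits on its own before the deadline (the ContainerDied event carries exitCode=0). A toy Go sketch of that contract, standing in for what the CRI runtime does rather than reproducing kubelet code:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGracePeriod sketches "Killing container with a grace period":
// deliver SIGTERM, wait up to the grace period for the process to exit on
// its own, and only then force-kill it.
func killWithGracePeriod(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	_ = cmd.Process.Signal(syscall.SIGTERM)
	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL once the grace period lapses
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "10")
	_ = cmd.Start()
	killWithGracePeriod(cmd, 2*time.Second) // gracePeriod=2, as logged above
}
```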
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.243284 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rlckz" podStartSLOduration=1.347268745 podStartE2EDuration="2.243262341s" podCreationTimestamp="2025-10-02 16:33:23 +0000 UTC" firstStartedPulling="2025-10-02 16:33:23.920129916 +0000 UTC m=+962.669359443" lastFinishedPulling="2025-10-02 16:33:24.816123512 +0000 UTC m=+963.565353039" observedRunningTime="2025-10-02 16:33:25.242764699 +0000 UTC m=+963.991994246" watchObservedRunningTime="2025-10-02 16:33:25.243262341 +0000 UTC m=+963.992491878"
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.251962 4882 scope.go:117] "RemoveContainer" containerID="ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"
Oct 02 16:33:25 crc kubenswrapper[4882]: E1002 16:33:25.252829 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce\": container with ID starting with ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce not found: ID does not exist" containerID="ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.252904 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce"} err="failed to get container status \"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce\": rpc error: code = NotFound desc = could not find container \"ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce\": container with ID starting with ad7d2124cc970779d6e21e8cecf303a8729287bd46e27772f3df0611b0d196ce not found: ID does not exist"
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.261377 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"]
Oct 02 16:33:25 crc kubenswrapper[4882]: I1002 16:33:25.265985 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hzcwz"]
Oct 02 16:33:26 crc kubenswrapper[4882]: I1002 16:33:26.769209 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74179993-8b72-4753-b104-008f95bb3fb0" path="/var/lib/kubelet/pods/74179993-8b72-4753-b104-008f95bb3fb0/volumes"
Oct 02 16:33:33 crc kubenswrapper[4882]: I1002 16:33:33.694742 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:33 crc kubenswrapper[4882]: I1002 16:33:33.695269 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:33 crc kubenswrapper[4882]: I1002 16:33:33.731820 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:34 crc kubenswrapper[4882]: I1002 16:33:34.324939 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rlckz"
Oct 02 16:33:39 crc kubenswrapper[4882]: I1002 16:33:39.389980 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
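Note the shape of the RemoveContainer failure a few records up: the container was already gone, the runtime answered NotFound over the CRI gRPC channel, and the kubelet logged the error without blocking pod cleanup (the DELETE/REMOVE sync and volume-dir cleanup still proceed). A hedged sketch of that delete-is-idempotent pattern; runtimeRemove is a stand-in for the real RemoveContainer RPC, not an actual kubelet function:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats codes.NotFound as success: if the container is
// already gone, the delete has effectively already happened.
func removeContainer(runtimeRemove func(id string) error, id string) error {
	if err := runtimeRemove(id); err != nil && status.Code(err) != codes.NotFound {
		return err
	}
	return nil
}

func main() {
	alreadyGone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeContainer(alreadyGone, "ad7d2124cc97")) // <nil>
}
```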
Oct 02 16:33:39 crc kubenswrapper[4882]: I1002 16:33:39.390442 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.846813 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"]
Oct 02 16:33:41 crc kubenswrapper[4882]: E1002 16:33:41.847675 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74179993-8b72-4753-b104-008f95bb3fb0" containerName="registry-server"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.847691 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="74179993-8b72-4753-b104-008f95bb3fb0" containerName="registry-server"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.847901 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="74179993-8b72-4753-b104-008f95bb3fb0" containerName="registry-server"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.848928 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.851628 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cbsp4"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.855124 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"]
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.981292 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltd6d\" (UniqueName: \"kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.981440 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:41 crc kubenswrapper[4882]: I1002 16:33:41.981483 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.083089 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltd6d\" (UniqueName: \"kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.083238 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.083301 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.084070 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.084204 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.108079 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltd6d\" (UniqueName: \"kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d\") pod \"4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") " pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.170051 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:42 crc kubenswrapper[4882]: I1002 16:33:42.598862 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"]
Oct 02 16:33:43 crc kubenswrapper[4882]: I1002 16:33:43.380310 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerID="00544dbb2505df743518f8672af1f8515c459801d5892a992fdd1ecfe36d18d8" exitCode=0
Oct 02 16:33:43 crc kubenswrapper[4882]: I1002 16:33:43.380562 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb" event={"ID":"9d37872a-965e-4d6c-8f57-688b99de18bb","Type":"ContainerDied","Data":"00544dbb2505df743518f8672af1f8515c459801d5892a992fdd1ecfe36d18d8"}
Oct 02 16:33:43 crc kubenswrapper[4882]: I1002 16:33:43.380820 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb" event={"ID":"9d37872a-965e-4d6c-8f57-688b99de18bb","Type":"ContainerStarted","Data":"dd0e5feeb6d5bb1098f4c6bfadbd3ad568c26a2d0a960056dd0c7f962475f640"}
Oct 02 16:33:45 crc kubenswrapper[4882]: I1002 16:33:45.398685 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerID="638a4b677ae3abd6df7bfe417f88954c862394b21219b0dfa3c5e1d04e96e9bb" exitCode=0
Oct 02 16:33:45 crc kubenswrapper[4882]: I1002 16:33:45.398810 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb" event={"ID":"9d37872a-965e-4d6c-8f57-688b99de18bb","Type":"ContainerDied","Data":"638a4b677ae3abd6df7bfe417f88954c862394b21219b0dfa3c5e1d04e96e9bb"}
Oct 02 16:33:46 crc kubenswrapper[4882]: I1002 16:33:46.408843 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerID="9a15651e68eda5dee10cbfaef065d159575032916556132c783c549aa33c2245" exitCode=0
Oct 02 16:33:46 crc kubenswrapper[4882]: I1002 16:33:46.408932 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb" event={"ID":"9d37872a-965e-4d6c-8f57-688b99de18bb","Type":"ContainerDied","Data":"9a15651e68eda5dee10cbfaef065d159575032916556132c783c549aa33c2245"}
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.678887 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
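The "Generic (PLEG): container finished" lines above come from the pod lifecycle event generator, which periodically relists container states and diffs them against the previous snapshot to synthesize ContainerStarted/ContainerDied events for the sync loop. A much-simplified Go sketch of that diff (the state strings are illustrative):

```go
package main

import "fmt"

// relist diffs two snapshots of container states and emits lifecycle
// events, a simplification of the kubelet's PLEG relist.
func relist(prev, cur map[string]string) []string {
	var events []string
	for id, state := range cur {
		old, seen := prev[id]
		switch {
		case !seen && state == "running":
			events = append(events, "ContainerStarted "+id)
		case seen && old == "running" && state == "exited":
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	prev := map[string]string{"00544dbb2505": "running"}
	cur := map[string]string{"00544dbb2505": "exited", "638a4b677ae3": "running"}
	fmt.Println(relist(prev, cur)) // one Died, one Started (order may vary)
}
```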
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.767715 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util\") pod \"9d37872a-965e-4d6c-8f57-688b99de18bb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") "
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.767783 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle\") pod \"9d37872a-965e-4d6c-8f57-688b99de18bb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") "
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.767891 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltd6d\" (UniqueName: \"kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d\") pod \"9d37872a-965e-4d6c-8f57-688b99de18bb\" (UID: \"9d37872a-965e-4d6c-8f57-688b99de18bb\") "
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.770261 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle" (OuterVolumeSpecName: "bundle") pod "9d37872a-965e-4d6c-8f57-688b99de18bb" (UID: "9d37872a-965e-4d6c-8f57-688b99de18bb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.773675 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d" (OuterVolumeSpecName: "kube-api-access-ltd6d") pod "9d37872a-965e-4d6c-8f57-688b99de18bb" (UID: "9d37872a-965e-4d6c-8f57-688b99de18bb"). InnerVolumeSpecName "kube-api-access-ltd6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.783042 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util" (OuterVolumeSpecName: "util") pod "9d37872a-965e-4d6c-8f57-688b99de18bb" (UID: "9d37872a-965e-4d6c-8f57-688b99de18bb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.869870 4882 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-util\") on node \"crc\" DevicePath \"\""
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.869927 4882 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d37872a-965e-4d6c-8f57-688b99de18bb-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:33:47 crc kubenswrapper[4882]: I1002 16:33:47.869941 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltd6d\" (UniqueName: \"kubernetes.io/projected/9d37872a-965e-4d6c-8f57-688b99de18bb-kube-api-access-ltd6d\") on node \"crc\" DevicePath \"\""
Oct 02 16:33:48 crc kubenswrapper[4882]: I1002 16:33:48.427086 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb" event={"ID":"9d37872a-965e-4d6c-8f57-688b99de18bb","Type":"ContainerDied","Data":"dd0e5feeb6d5bb1098f4c6bfadbd3ad568c26a2d0a960056dd0c7f962475f640"}
Oct 02 16:33:48 crc kubenswrapper[4882]: I1002 16:33:48.427153 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0e5feeb6d5bb1098f4c6bfadbd3ad568c26a2d0a960056dd0c7f962475f640"
Oct 02 16:33:48 crc kubenswrapper[4882]: I1002 16:33:48.427124 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.457511 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"]
Oct 02 16:33:54 crc kubenswrapper[4882]: E1002 16:33:54.458477 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="pull"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.458493 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="pull"
Oct 02 16:33:54 crc kubenswrapper[4882]: E1002 16:33:54.458508 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="util"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.458518 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="util"
Oct 02 16:33:54 crc kubenswrapper[4882]: E1002 16:33:54.458529 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="extract"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.458536 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="extract"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.458680 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d37872a-965e-4d6c-8f57-688b99de18bb" containerName="extract"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.459493 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.462766 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-7qj49"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.521691 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"]
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.565851 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5c9\" (UniqueName: \"kubernetes.io/projected/bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c-kube-api-access-td5c9\") pod \"openstack-operator-controller-operator-bb8dc5db7-k2zgd\" (UID: \"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c\") " pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.667093 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td5c9\" (UniqueName: \"kubernetes.io/projected/bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c-kube-api-access-td5c9\") pod \"openstack-operator-controller-operator-bb8dc5db7-k2zgd\" (UID: \"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c\") " pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.692455 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td5c9\" (UniqueName: \"kubernetes.io/projected/bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c-kube-api-access-td5c9\") pod \"openstack-operator-controller-operator-bb8dc5db7-k2zgd\" (UID: \"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c\") " pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:33:54 crc kubenswrapper[4882]: I1002 16:33:54.789005 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:33:55 crc kubenswrapper[4882]: I1002 16:33:55.030621 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"]
Oct 02 16:33:55 crc kubenswrapper[4882]: I1002 16:33:55.486372 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd" event={"ID":"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c","Type":"ContainerStarted","Data":"c0062aa88807b05b8b7c3934585a39ec2c46baf01c339bbc4fc1012222e951b7"}
Oct 02 16:34:00 crc kubenswrapper[4882]: I1002 16:34:00.531951 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd" event={"ID":"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c","Type":"ContainerStarted","Data":"51c6e3db891ed135eaac7e5341f918c4242ab5cd8d9ed341c0fcdecf91389ff4"}
Oct 02 16:34:03 crc kubenswrapper[4882]: I1002 16:34:03.553546 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd" event={"ID":"bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c","Type":"ContainerStarted","Data":"ef90d60597bb27e6452be071537289cb04d5a6cfbd07a704421604b168dbc2ea"}
Oct 02 16:34:03 crc kubenswrapper[4882]: I1002 16:34:03.555357 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:34:03 crc kubenswrapper[4882]: I1002 16:34:03.587710 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd" podStartSLOduration=1.728396029 podStartE2EDuration="9.587683311s" podCreationTimestamp="2025-10-02 16:33:54 +0000 UTC" firstStartedPulling="2025-10-02 16:33:55.057364043 +0000 UTC m=+993.806593570" lastFinishedPulling="2025-10-02 16:34:02.916651325 +0000 UTC m=+1001.665880852" observedRunningTime="2025-10-02 16:34:03.581790885 +0000 UTC m=+1002.331020412" watchObservedRunningTime="2025-10-02 16:34:03.587683311 +0000 UTC m=+1002.336912838"
Oct 02 16:34:04 crc kubenswrapper[4882]: I1002 16:34:04.560907 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-bb8dc5db7-k2zgd"
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.390062 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.390445 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.390493 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv"
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.391048 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.391092 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48" gracePeriod=600
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.593019 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48" exitCode=0
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.593097 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48"}
Oct 02 16:34:09 crc kubenswrapper[4882]: I1002 16:34:09.593365 4882 scope.go:117] "RemoveContainer" containerID="731f80fa116efa00cebc80d74d3ffae0e209de2d426a9dc8596e8ee975fa0480"
Oct 02 16:34:10 crc kubenswrapper[4882]: I1002 16:34:10.603325 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d"}
Oct 02 16:34:38 crc kubenswrapper[4882]: I1002 16:34:38.957277 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"]
Oct 02 16:34:38 crc kubenswrapper[4882]: I1002 16:34:38.959055 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"
Oct 02 16:34:38 crc kubenswrapper[4882]: I1002 16:34:38.961283 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-drrb5"
Oct 02 16:34:38 crc kubenswrapper[4882]: I1002 16:34:38.996342 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.025273 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.026560 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.026658 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.028076 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.031334 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-crmgx"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.031559 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-76btd"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.036030 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.041575 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.042719 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.052283 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.054705 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jt4gh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.055285 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.067041 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.068300 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.073185 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6rd65"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.086185 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.091024 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbl2n\" (UniqueName: \"kubernetes.io/projected/d32146d6-c9aa-4864-bab5-71beebc6c6ef-kube-api-access-sbl2n\") pod \"barbican-operator-controller-manager-6d6d64fdcf-mx2hz\" (UID: \"d32146d6-c9aa-4864-bab5-71beebc6c6ef\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.091313 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbpf\" (UniqueName: \"kubernetes.io/projected/cd42cabb-2afc-45b4-aecd-9d97980e0840-kube-api-access-mxbpf\") pod \"cinder-operator-controller-manager-8686fd99f7-nwf78\" (UID: \"cd42cabb-2afc-45b4-aecd-9d97980e0840\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.091454 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8c6c\" (UniqueName: \"kubernetes.io/projected/aece5dae-37d5-44cc-b83c-611712686fbb-kube-api-access-v8c6c\") pod \"designate-operator-controller-manager-58d86cd59d-j4mbs\" (UID: \"aece5dae-37d5-44cc-b83c-611712686fbb\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.105277 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.107014 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.112344 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bvgv8"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.112586 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.121297 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.122285 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.132738 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vzh4f"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.139298 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.140263 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.141716 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r2dmz"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.169339 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.183924 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193017 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193081 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnfqk\" (UniqueName: \"kubernetes.io/projected/56d9b46f-821f-44dd-8139-4b3d1c2b1149-kube-api-access-jnfqk\") pod \"heat-operator-controller-manager-5ffbdb7ddf-4z9l6\" (UID: \"56d9b46f-821f-44dd-8139-4b3d1c2b1149\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193116 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vs2\" (UniqueName: \"kubernetes.io/projected/aee0bda3-847c-41b1-bb44-b2bd8875c9a1-kube-api-access-z9vs2\") pod \"horizon-operator-controller-manager-586b66cf4f-t5nnp\" (UID: \"aee0bda3-847c-41b1-bb44-b2bd8875c9a1\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193147 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxb7\" (UniqueName: \"kubernetes.io/projected/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-kube-api-access-vlxb7\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193201 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cckvl\" (UniqueName: \"kubernetes.io/projected/2c520177-6db1-41dd-808a-49f676f9870c-kube-api-access-cckvl\") pod \"ironic-operator-controller-manager-59b5fc9845-krzwg\" (UID: \"2c520177-6db1-41dd-808a-49f676f9870c\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193253 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbl2n\" (UniqueName: \"kubernetes.io/projected/d32146d6-c9aa-4864-bab5-71beebc6c6ef-kube-api-access-sbl2n\") pod \"barbican-operator-controller-manager-6d6d64fdcf-mx2hz\" (UID: \"d32146d6-c9aa-4864-bab5-71beebc6c6ef\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193278 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25gh\" (UniqueName: \"kubernetes.io/projected/cdf5d013-2e3a-41f5-ad36-c870b219a572-kube-api-access-p25gh\") pod \"glance-operator-controller-manager-d785ddfd5-2sngh\" (UID: \"cdf5d013-2e3a-41f5-ad36-c870b219a572\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193333 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbpf\" (UniqueName: \"kubernetes.io/projected/cd42cabb-2afc-45b4-aecd-9d97980e0840-kube-api-access-mxbpf\") pod \"cinder-operator-controller-manager-8686fd99f7-nwf78\" (UID: \"cd42cabb-2afc-45b4-aecd-9d97980e0840\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.193400 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8c6c\" (UniqueName: \"kubernetes.io/projected/aece5dae-37d5-44cc-b83c-611712686fbb-kube-api-access-v8c6c\") pod \"designate-operator-controller-manager-58d86cd59d-j4mbs\" (UID: \"aece5dae-37d5-44cc-b83c-611712686fbb\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.197771 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.213375 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.214511 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.219152 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.220532 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.221984 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kk55r"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.222186 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5qp4x"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.234750 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8c6c\" (UniqueName: \"kubernetes.io/projected/aece5dae-37d5-44cc-b83c-611712686fbb-kube-api-access-v8c6c\") pod \"designate-operator-controller-manager-58d86cd59d-j4mbs\" (UID: \"aece5dae-37d5-44cc-b83c-611712686fbb\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.234820 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbl2n\" (UniqueName: \"kubernetes.io/projected/d32146d6-c9aa-4864-bab5-71beebc6c6ef-kube-api-access-sbl2n\") pod \"barbican-operator-controller-manager-6d6d64fdcf-mx2hz\" (UID: \"d32146d6-c9aa-4864-bab5-71beebc6c6ef\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.236494 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.238100 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbpf\" (UniqueName: \"kubernetes.io/projected/cd42cabb-2afc-45b4-aecd-9d97980e0840-kube-api-access-mxbpf\") pod \"cinder-operator-controller-manager-8686fd99f7-nwf78\" (UID: \"cd42cabb-2afc-45b4-aecd-9d97980e0840\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.239576 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.258396 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.259900 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.263561 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tnbc2"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.272241 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.295938 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296804 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvl\" (UniqueName: \"kubernetes.io/projected/2c520177-6db1-41dd-808a-49f676f9870c-kube-api-access-cckvl\") pod \"ironic-operator-controller-manager-59b5fc9845-krzwg\" (UID: \"2c520177-6db1-41dd-808a-49f676f9870c\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296841 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25gh\" (UniqueName: \"kubernetes.io/projected/cdf5d013-2e3a-41f5-ad36-c870b219a572-kube-api-access-p25gh\") pod \"glance-operator-controller-manager-d785ddfd5-2sngh\" (UID: \"cdf5d013-2e3a-41f5-ad36-c870b219a572\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296894 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296918 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7s5\" (UniqueName: \"kubernetes.io/projected/08f8538a-699b-4ef0-9e48-da37040acdeb-kube-api-access-nl7s5\") pod \"manila-operator-controller-manager-66fdd975d9-qb4lj\" (UID: \"08f8538a-699b-4ef0-9e48-da37040acdeb\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296937 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlq7\" (UniqueName: \"kubernetes.io/projected/131678a2-c80e-4990-85ef-9f8ed80cb4da-kube-api-access-nmlq7\") pod \"keystone-operator-controller-manager-6c9969c6c6-bhztc\" (UID: \"131678a2-c80e-4990-85ef-9f8ed80cb4da\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296977 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnfqk\" (UniqueName: \"kubernetes.io/projected/56d9b46f-821f-44dd-8139-4b3d1c2b1149-kube-api-access-jnfqk\") pod \"heat-operator-controller-manager-5ffbdb7ddf-4z9l6\" (UID: \"56d9b46f-821f-44dd-8139-4b3d1c2b1149\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.296996 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vs2\" (UniqueName: \"kubernetes.io/projected/aee0bda3-847c-41b1-bb44-b2bd8875c9a1-kube-api-access-z9vs2\") pod \"horizon-operator-controller-manager-586b66cf4f-t5nnp\" (UID: \"aee0bda3-847c-41b1-bb44-b2bd8875c9a1\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.297016 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxb7\" (UniqueName: \"kubernetes.io/projected/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-kube-api-access-vlxb7\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.297328 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: E1002 16:34:39.297734 4882 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 02 16:34:39 crc kubenswrapper[4882]: E1002 16:34:39.297777 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert podName:8b6c43c1-61b7-42ea-b66e-a19e54e67b50 nodeName:}" failed. No retries permitted until 2025-10-02 16:34:39.797761683 +0000 UTC m=+1038.546991210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert") pod "infra-operator-controller-manager-7c9978f67-lgpvs" (UID: "8b6c43c1-61b7-42ea-b66e-a19e54e67b50") : secret "infra-operator-webhook-server-cert" not found
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.301452 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.302607 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vj6k6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.309568 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.328608 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.331049 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.338076 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnfqk\" (UniqueName: \"kubernetes.io/projected/56d9b46f-821f-44dd-8139-4b3d1c2b1149-kube-api-access-jnfqk\") pod \"heat-operator-controller-manager-5ffbdb7ddf-4z9l6\" (UID: \"56d9b46f-821f-44dd-8139-4b3d1c2b1149\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.339992 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2vvbb"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.348699 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.351578 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cckvl\" (UniqueName: \"kubernetes.io/projected/2c520177-6db1-41dd-808a-49f676f9870c-kube-api-access-cckvl\") pod \"ironic-operator-controller-manager-59b5fc9845-krzwg\" (UID: \"2c520177-6db1-41dd-808a-49f676f9870c\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.355905 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25gh\" (UniqueName: \"kubernetes.io/projected/cdf5d013-2e3a-41f5-ad36-c870b219a572-kube-api-access-p25gh\") pod \"glance-operator-controller-manager-d785ddfd5-2sngh\" (UID: \"cdf5d013-2e3a-41f5-ad36-c870b219a572\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.357812 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxb7\" (UniqueName: \"kubernetes.io/projected/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-kube-api-access-vlxb7\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.360857 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.373730 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.383276 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.384442 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.390041 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vs2\" (UniqueName: \"kubernetes.io/projected/aee0bda3-847c-41b1-bb44-b2bd8875c9a1-kube-api-access-z9vs2\") pod \"horizon-operator-controller-manager-586b66cf4f-t5nnp\" (UID: \"aee0bda3-847c-41b1-bb44-b2bd8875c9a1\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.390397 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-668pm"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.391669 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.392294 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.393043 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398034 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-69nm4"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398272 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398586 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/63f23924-224e-40d8-901a-bbb56f30163d-kube-api-access-l5lj7\") pod \"nova-operator-controller-manager-5b45478b88-x6jfr\" (UID: \"63f23924-224e-40d8-901a-bbb56f30163d\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398625 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7s5\" (UniqueName: \"kubernetes.io/projected/08f8538a-699b-4ef0-9e48-da37040acdeb-kube-api-access-nl7s5\") pod \"manila-operator-controller-manager-66fdd975d9-qb4lj\" (UID: \"08f8538a-699b-4ef0-9e48-da37040acdeb\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398648 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlq7\" (UniqueName: \"kubernetes.io/projected/131678a2-c80e-4990-85ef-9f8ed80cb4da-kube-api-access-nmlq7\") pod \"keystone-operator-controller-manager-6c9969c6c6-bhztc\" (UID: \"131678a2-c80e-4990-85ef-9f8ed80cb4da\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398694 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7l8\" (UniqueName: \"kubernetes.io/projected/75e8c312-5fcf-4106-953f-85077c2485aa-kube-api-access-2f7l8\") pod \"mariadb-operator-controller-manager-696ff4bcdd-ddwmt\" (UID: \"75e8c312-5fcf-4106-953f-85077c2485aa\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.398732 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtv4\" (UniqueName: \"kubernetes.io/projected/8b1d8fdd-7f24-41c3-a604-7bd8df0972c4-kube-api-access-hbtv4\") pod \"neutron-operator-controller-manager-549fb68678-hhcmt\" (UID: \"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.425374 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.426416 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.432934 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7s5\" (UniqueName: \"kubernetes.io/projected/08f8538a-699b-4ef0-9e48-da37040acdeb-kube-api-access-nl7s5\") pod \"manila-operator-controller-manager-66fdd975d9-qb4lj\" (UID: \"08f8538a-699b-4ef0-9e48-da37040acdeb\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.436828 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlq7\" (UniqueName: \"kubernetes.io/projected/131678a2-c80e-4990-85ef-9f8ed80cb4da-kube-api-access-nmlq7\") pod \"keystone-operator-controller-manager-6c9969c6c6-bhztc\" (UID: \"131678a2-c80e-4990-85ef-9f8ed80cb4da\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.437229 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.444418 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.444877 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lctvd"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.450949 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.465869 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.490190 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh"]
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.499903 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/63f23924-224e-40d8-901a-bbb56f30163d-kube-api-access-l5lj7\") pod \"nova-operator-controller-manager-5b45478b88-x6jfr\" (UID: \"63f23924-224e-40d8-901a-bbb56f30163d\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500334 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7l8\" (UniqueName: \"kubernetes.io/projected/75e8c312-5fcf-4106-953f-85077c2485aa-kube-api-access-2f7l8\") pod \"mariadb-operator-controller-manager-696ff4bcdd-ddwmt\" (UID: \"75e8c312-5fcf-4106-953f-85077c2485aa\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500384 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgcvc\" (UniqueName: \"kubernetes.io/projected/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-kube-api-access-jgcvc\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500432 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtv4\" (UniqueName: \"kubernetes.io/projected/8b1d8fdd-7f24-41c3-a604-7bd8df0972c4-kube-api-access-hbtv4\") pod \"neutron-operator-controller-manager-549fb68678-hhcmt\" (UID: \"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500471 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500496 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b96xr\" (UniqueName: \"kubernetes.io/projected/a50abab9-485d-40a2-80d7-5d3134a14908-kube-api-access-b96xr\") pod \"octavia-operator-controller-manager-b4444585c-dfckg\" (UID: \"a50abab9-485d-40a2-80d7-5d3134a14908\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg"
Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.500552 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvsk\" (UniqueName: \"kubernetes.io/projected/e05cf5dc-1295-4227-9527-f22cac2d26a0-kube-api-access-zxvsk\") pod \"ovn-operator-controller-manager-855d7949fc-tqmjh\" (UID: \"e05cf5dc-1295-4227-9527-f22cac2d26a0\") "
pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.523244 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.529190 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7l8\" (UniqueName: \"kubernetes.io/projected/75e8c312-5fcf-4106-953f-85077c2485aa-kube-api-access-2f7l8\") pod \"mariadb-operator-controller-manager-696ff4bcdd-ddwmt\" (UID: \"75e8c312-5fcf-4106-953f-85077c2485aa\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.530039 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5lj7\" (UniqueName: \"kubernetes.io/projected/63f23924-224e-40d8-901a-bbb56f30163d-kube-api-access-l5lj7\") pod \"nova-operator-controller-manager-5b45478b88-x6jfr\" (UID: \"63f23924-224e-40d8-901a-bbb56f30163d\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.537099 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtv4\" (UniqueName: \"kubernetes.io/projected/8b1d8fdd-7f24-41c3-a604-7bd8df0972c4-kube-api-access-hbtv4\") pod \"neutron-operator-controller-manager-549fb68678-hhcmt\" (UID: \"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.557263 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.563665 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.569156 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xbrld" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.575929 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-25zdk"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.580322 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.588009 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.590339 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qt4ng" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.590817 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.591540 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-25zdk"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.602500 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvsk\" (UniqueName: \"kubernetes.io/projected/e05cf5dc-1295-4227-9527-f22cac2d26a0-kube-api-access-zxvsk\") pod \"ovn-operator-controller-manager-855d7949fc-tqmjh\" (UID: \"e05cf5dc-1295-4227-9527-f22cac2d26a0\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.602599 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgcvc\" (UniqueName: \"kubernetes.io/projected/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-kube-api-access-jgcvc\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.602633 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.602653 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b96xr\" (UniqueName: \"kubernetes.io/projected/a50abab9-485d-40a2-80d7-5d3134a14908-kube-api-access-b96xr\") pod \"octavia-operator-controller-manager-b4444585c-dfckg\" (UID: \"a50abab9-485d-40a2-80d7-5d3134a14908\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" Oct 02 16:34:39 crc kubenswrapper[4882]: E1002 16:34:39.603789 4882 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 16:34:39 crc kubenswrapper[4882]: E1002 16:34:39.603828 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert podName:7a766286-4d8e-45ce-b444-3dcf4cd9bf57 nodeName:}" failed. No retries permitted until 2025-10-02 16:34:40.103814781 +0000 UTC m=+1038.853044298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert") pod "openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" (UID: "7a766286-4d8e-45ce-b444-3dcf4cd9bf57") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.621485 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.642247 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvsk\" (UniqueName: \"kubernetes.io/projected/e05cf5dc-1295-4227-9527-f22cac2d26a0-kube-api-access-zxvsk\") pod \"ovn-operator-controller-manager-855d7949fc-tqmjh\" (UID: \"e05cf5dc-1295-4227-9527-f22cac2d26a0\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.649842 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgcvc\" (UniqueName: \"kubernetes.io/projected/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-kube-api-access-jgcvc\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.660376 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b96xr\" (UniqueName: \"kubernetes.io/projected/a50abab9-485d-40a2-80d7-5d3134a14908-kube-api-access-b96xr\") pod \"octavia-operator-controller-manager-b4444585c-dfckg\" (UID: \"a50abab9-485d-40a2-80d7-5d3134a14908\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.660482 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.667570 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.668699 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.684723 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vd79j" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.704139 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h7wv\" (UniqueName: \"kubernetes.io/projected/8f5b9997-c12c-4ba9-9efe-649f9d03e52e-kube-api-access-4h7wv\") pod \"swift-operator-controller-manager-76d5577b-25zdk\" (UID: \"8f5b9997-c12c-4ba9-9efe-649f9d03e52e\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.704267 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dxp\" (UniqueName: \"kubernetes.io/projected/968e86ab-e082-4b95-a520-67672bc1d662-kube-api-access-j2dxp\") pod \"placement-operator-controller-manager-ccbfcb8c-hvnhq\" (UID: \"968e86ab-e082-4b95-a520-67672bc1d662\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.709047 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.728664 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.734085 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.736010 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.741957 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4gvgt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.760198 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.762880 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.793550 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.806046 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h7wv\" (UniqueName: \"kubernetes.io/projected/8f5b9997-c12c-4ba9-9efe-649f9d03e52e-kube-api-access-4h7wv\") pod \"swift-operator-controller-manager-76d5577b-25zdk\" (UID: \"8f5b9997-c12c-4ba9-9efe-649f9d03e52e\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.806103 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dxp\" (UniqueName: \"kubernetes.io/projected/968e86ab-e082-4b95-a520-67672bc1d662-kube-api-access-j2dxp\") pod \"placement-operator-controller-manager-ccbfcb8c-hvnhq\" (UID: \"968e86ab-e082-4b95-a520-67672bc1d662\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.806199 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.806261 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c26\" (UniqueName: \"kubernetes.io/projected/7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7-kube-api-access-b6c26\") pod \"test-operator-controller-manager-6bb6dcddc-twxqz\" (UID: \"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.806289 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk4dc\" (UniqueName: \"kubernetes.io/projected/a430ae1a-9023-4c9d-a4ee-ed474e3bbd88-kube-api-access-nk4dc\") pod \"telemetry-operator-controller-manager-5ffb97cddf-qmnwx\" (UID: \"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.811803 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.816339 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.822697 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6c43c1-61b7-42ea-b66e-a19e54e67b50-cert\") pod \"infra-operator-controller-manager-7c9978f67-lgpvs\" (UID: \"8b6c43c1-61b7-42ea-b66e-a19e54e67b50\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.825497 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.825863 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.826073 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5w86w" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.829918 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dxp\" (UniqueName: \"kubernetes.io/projected/968e86ab-e082-4b95-a520-67672bc1d662-kube-api-access-j2dxp\") pod \"placement-operator-controller-manager-ccbfcb8c-hvnhq\" (UID: \"968e86ab-e082-4b95-a520-67672bc1d662\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.831151 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h7wv\" (UniqueName: \"kubernetes.io/projected/8f5b9997-c12c-4ba9-9efe-649f9d03e52e-kube-api-access-4h7wv\") pod \"swift-operator-controller-manager-76d5577b-25zdk\" (UID: \"8f5b9997-c12c-4ba9-9efe-649f9d03e52e\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.855606 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.856886 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.863066 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2g2cm" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.863257 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.879185 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.908251 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk4dc\" (UniqueName: \"kubernetes.io/projected/a430ae1a-9023-4c9d-a4ee-ed474e3bbd88-kube-api-access-nk4dc\") pod \"telemetry-operator-controller-manager-5ffb97cddf-qmnwx\" (UID: \"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.908333 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54zdl\" (UniqueName: \"kubernetes.io/projected/96795030-dccf-45ce-9aa5-6104145b247d-kube-api-access-54zdl\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.908436 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfb8\" (UniqueName: 
\"kubernetes.io/projected/922b0b10-8ea8-4b15-b170-561690a29ff1-kube-api-access-dlfb8\") pod \"watcher-operator-controller-manager-5595cf6c95-5sm69\" (UID: \"922b0b10-8ea8-4b15-b170-561690a29ff1\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.908489 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.908534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c26\" (UniqueName: \"kubernetes.io/projected/7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7-kube-api-access-b6c26\") pod \"test-operator-controller-manager-6bb6dcddc-twxqz\" (UID: \"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.934100 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.941271 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.943461 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c26\" (UniqueName: \"kubernetes.io/projected/7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7-kube-api-access-b6c26\") pod \"test-operator-controller-manager-6bb6dcddc-twxqz\" (UID: \"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.944350 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.946859 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk4dc\" (UniqueName: \"kubernetes.io/projected/a430ae1a-9023-4c9d-a4ee-ed474e3bbd88-kube-api-access-nk4dc\") pod \"telemetry-operator-controller-manager-5ffb97cddf-qmnwx\" (UID: \"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.949430 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jzwxt" Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.962095 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx"] Oct 02 16:34:39 crc kubenswrapper[4882]: I1002 16:34:39.990948 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.006374 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.011230 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6s7k\" (UniqueName: \"kubernetes.io/projected/6a6df481-1b16-40a6-b98a-efd2172c77d2-kube-api-access-h6s7k\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-27tjx\" (UID: \"6a6df481-1b16-40a6-b98a-efd2172c77d2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.011308 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfb8\" (UniqueName: \"kubernetes.io/projected/922b0b10-8ea8-4b15-b170-561690a29ff1-kube-api-access-dlfb8\") pod \"watcher-operator-controller-manager-5595cf6c95-5sm69\" (UID: \"922b0b10-8ea8-4b15-b170-561690a29ff1\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.011352 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:40 crc kubenswrapper[4882]: E1002 16:34:40.012101 4882 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 16:34:40 crc kubenswrapper[4882]: E1002 16:34:40.012164 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert podName:96795030-dccf-45ce-9aa5-6104145b247d nodeName:}" failed. No retries permitted until 2025-10-02 16:34:40.512142323 +0000 UTC m=+1039.261371850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert") pod "openstack-operator-controller-manager-6c9cff6d55-xtjhj" (UID: "96795030-dccf-45ce-9aa5-6104145b247d") : secret "webhook-server-cert" not found Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.016470 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54zdl\" (UniqueName: \"kubernetes.io/projected/96795030-dccf-45ce-9aa5-6104145b247d-kube-api-access-54zdl\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.019031 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.021540 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.035105 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.038142 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.052851 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfb8\" (UniqueName: \"kubernetes.io/projected/922b0b10-8ea8-4b15-b170-561690a29ff1-kube-api-access-dlfb8\") pod \"watcher-operator-controller-manager-5595cf6c95-5sm69\" (UID: \"922b0b10-8ea8-4b15-b170-561690a29ff1\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.073630 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54zdl\" (UniqueName: \"kubernetes.io/projected/96795030-dccf-45ce-9aa5-6104145b247d-kube-api-access-54zdl\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.077073 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.118554 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.118655 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6s7k\" (UniqueName: \"kubernetes.io/projected/6a6df481-1b16-40a6-b98a-efd2172c77d2-kube-api-access-h6s7k\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-27tjx\" (UID: \"6a6df481-1b16-40a6-b98a-efd2172c77d2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.130860 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a766286-4d8e-45ce-b444-3dcf4cd9bf57-cert\") pod \"openstack-baremetal-operator-controller-manager-7774cdf765ssbw4\" (UID: \"7a766286-4d8e-45ce-b444-3dcf4cd9bf57\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.141816 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.150933 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6s7k\" (UniqueName: \"kubernetes.io/projected/6a6df481-1b16-40a6-b98a-efd2172c77d2-kube-api-access-h6s7k\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-27tjx\" (UID: \"6a6df481-1b16-40a6-b98a-efd2172c77d2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.279386 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.414791 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.477993 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.521645 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.530799 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:40 crc kubenswrapper[4882]: E1002 16:34:40.531105 4882 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 16:34:40 crc kubenswrapper[4882]: E1002 16:34:40.531171 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert podName:96795030-dccf-45ce-9aa5-6104145b247d nodeName:}" failed. No retries permitted until 2025-10-02 16:34:41.531151725 +0000 UTC m=+1040.280381252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert") pod "openstack-operator-controller-manager-6c9cff6d55-xtjhj" (UID: "96795030-dccf-45ce-9aa5-6104145b247d") : secret "webhook-server-cert" not found Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.568621 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaece5dae_37d5_44cc_b83c_611712686fbb.slice/crio-5afbf1708beb07bc91fd45014394d5eeef1877a359d49ad78a3cf9d4b8a567c0 WatchSource:0}: Error finding container 5afbf1708beb07bc91fd45014394d5eeef1877a359d49ad78a3cf9d4b8a567c0: Status 404 returned error can't find the container with id 5afbf1708beb07bc91fd45014394d5eeef1877a359d49ad78a3cf9d4b8a567c0 Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.597179 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.816045 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" event={"ID":"aece5dae-37d5-44cc-b83c-611712686fbb","Type":"ContainerStarted","Data":"5afbf1708beb07bc91fd45014394d5eeef1877a359d49ad78a3cf9d4b8a567c0"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.818113 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" event={"ID":"131678a2-c80e-4990-85ef-9f8ed80cb4da","Type":"ContainerStarted","Data":"5bfe3fa5ee960b88053ab7861fc06640673d8093837fc0d75b567d2f2c44d489"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.819693 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" event={"ID":"cd42cabb-2afc-45b4-aecd-9d97980e0840","Type":"ContainerStarted","Data":"d47c1ffeb24d5ecb2e37565ac35174ded1536ff89807e312390ecbb77ff85ec0"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.820961 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" event={"ID":"56d9b46f-821f-44dd-8139-4b3d1c2b1149","Type":"ContainerStarted","Data":"d0949644d43216acdfc62272be3913ada5cbb2d955ee928a7bd80b29a2bad773"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.823771 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" event={"ID":"cdf5d013-2e3a-41f5-ad36-c870b219a572","Type":"ContainerStarted","Data":"f7162e73a1b2b985f57fd45b81a799c3cb8feca46e06dd362e24181912a78385"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.825038 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" event={"ID":"d32146d6-c9aa-4864-bab5-71beebc6c6ef","Type":"ContainerStarted","Data":"4fb6b2f52c3a65f9e1e2398accca1799d8f7b1ca16ec2c9cbfd3a7c9b4accdbb"} Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.884315 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp"] Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.891407 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee0bda3_847c_41b1_bb44_b2bd8875c9a1.slice/crio-45455f5a3fe678d50a4275d82b307122e531cdf60d1bde66839b81481a02db2d WatchSource:0}: Error finding container 45455f5a3fe678d50a4275d82b307122e531cdf60d1bde66839b81481a02db2d: Status 404 returned error can't find the container with id 45455f5a3fe678d50a4275d82b307122e531cdf60d1bde66839b81481a02db2d Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.897892 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj"] Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.898668 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c520177_6db1_41dd_808a_49f676f9870c.slice/crio-3d89355fe6895050c25dd434a8a1f624ce2455e770bc213a748118678bcadbb5 WatchSource:0}: Error finding container 3d89355fe6895050c25dd434a8a1f624ce2455e770bc213a748118678bcadbb5: Status 404 returned error can't find the container with id 3d89355fe6895050c25dd434a8a1f624ce2455e770bc213a748118678bcadbb5 Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.901386 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e8c312_5fcf_4106_953f_85077c2485aa.slice/crio-13c16b4b0ba7dde79d5232731432a01994796764f821fd591103aff97bc9894d WatchSource:0}: Error finding container 13c16b4b0ba7dde79d5232731432a01994796764f821fd591103aff97bc9894d: Status 404 returned error can't find the container with id 13c16b4b0ba7dde79d5232731432a01994796764f821fd591103aff97bc9894d Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.913229 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.918388 4882 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.957559 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.963182 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr"] Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.964947 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f23924_224e_40d8_901a_bbb56f30163d.slice/crio-cbe03bae4f8efdd16e085381b429f91fc4bf1ce891cb17819a13afd97192fe6d WatchSource:0}: Error finding container cbe03bae4f8efdd16e085381b429f91fc4bf1ce891cb17819a13afd97192fe6d: Status 404 returned error can't find the container with id cbe03bae4f8efdd16e085381b429f91fc4bf1ce891cb17819a13afd97192fe6d Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.977313 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt"] Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.982263 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b1d8fdd_7f24_41c3_a604_7bd8df0972c4.slice/crio-0e83f117e4a7740289f7549fd20920e9813e73966be060ddb80cacaaa65bd51b WatchSource:0}: Error finding container 0e83f117e4a7740289f7549fd20920e9813e73966be060ddb80cacaaa65bd51b: Status 404 returned error can't find the container with id 0e83f117e4a7740289f7549fd20920e9813e73966be060ddb80cacaaa65bd51b Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.983505 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5b9997_c12c_4ba9_9efe_649f9d03e52e.slice/crio-318ac8b9a25e2c6bee5414fc71eb285bbe2c3b9874e38d911b3effdafa05be66 WatchSource:0}: Error finding container 318ac8b9a25e2c6bee5414fc71eb285bbe2c3b9874e38d911b3effdafa05be66: Status 404 returned error can't find the container with id 318ac8b9a25e2c6bee5414fc71eb285bbe2c3b9874e38d911b3effdafa05be66 Oct 02 16:34:40 crc kubenswrapper[4882]: W1002 16:34:40.984989 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda430ae1a_9023_4c9d_a4ee_ed474e3bbd88.slice/crio-b58ca174c7e5bb779be49e203d0c4db858c558144190d8fab3eb6163ff3ca2e6 WatchSource:0}: Error finding container b58ca174c7e5bb779be49e203d0c4db858c558144190d8fab3eb6163ff3ca2e6: Status 404 returned error can't find the container with id b58ca174c7e5bb779be49e203d0c4db858c558144190d8fab3eb6163ff3ca2e6 Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.985970 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-25zdk"] Oct 02 16:34:40 crc kubenswrapper[4882]: I1002 16:34:40.996417 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg"] Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.003718 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b96xr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-b4444585c-dfckg_openstack-operators(a50abab9-485d-40a2-80d7-5d3134a14908): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: W1002 16:34:41.110688 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod922b0b10_8ea8_4b15_b170_561690a29ff1.slice/crio-91b70d720171cdf25faa494213f2c484bc8ba09962dc31f168ae4590b598b3d4 WatchSource:0}: Error finding container 91b70d720171cdf25faa494213f2c484bc8ba09962dc31f168ae4590b598b3d4: Status 404 returned error can't find the container with id 91b70d720171cdf25faa494213f2c484bc8ba09962dc31f168ae4590b598b3d4 Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.112314 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz"] Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.115746 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlfb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5595cf6c95-5sm69_openstack-operators(922b0b10-8ea8-4b15-b170-561690a29ff1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.116972 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlxb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7c9978f67-lgpvs_openstack-operators(8b6c43c1-61b7-42ea-b66e-a19e54e67b50): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.124259 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69"] Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.131298 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs"] Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.143051 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq"] Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.145086 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6c26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6bb6dcddc-twxqz_openstack-operators(7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.148120 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh"] Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.152011 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx"] Oct 02 16:34:41 crc kubenswrapper[4882]: W1002 16:34:41.177026 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05cf5dc_1295_4227_9527_f22cac2d26a0.slice/crio-14b4169651544db576a2a76281d6076ddd8b7ff9a7717ce9c2bf4dfdfb1c6df4 WatchSource:0}: Error finding container 14b4169651544db576a2a76281d6076ddd8b7ff9a7717ce9c2bf4dfdfb1c6df4: Status 404 returned error can't find the container with id 14b4169651544db576a2a76281d6076ddd8b7ff9a7717ce9c2bf4dfdfb1c6df4 Oct 02 16:34:41 crc kubenswrapper[4882]: W1002 16:34:41.177803 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968e86ab_e082_4b95_a520_67672bc1d662.slice/crio-52fe1a802f5e1ddd2fcd11d475029d24f7fade23c6b7a33595407974335ca05c WatchSource:0}: Error finding container 52fe1a802f5e1ddd2fcd11d475029d24f7fade23c6b7a33595407974335ca05c: Status 404 returned error can't find the container with id 52fe1a802f5e1ddd2fcd11d475029d24f7fade23c6b7a33595407974335ca05c Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.180931 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zxvsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-855d7949fc-tqmjh_openstack-operators(e05cf5dc-1295-4227-9527-f22cac2d26a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: W1002 16:34:41.181093 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a6df481_1b16_40a6_b98a_efd2172c77d2.slice/crio-dd3dfc1c92b841b3958c90d7bfe86f93a4c1d4482a77226d513537426bb114e8 WatchSource:0}: Error finding container dd3dfc1c92b841b3958c90d7bfe86f93a4c1d4482a77226d513537426bb114e8: Status 404 returned error can't find the container with id dd3dfc1c92b841b3958c90d7bfe86f93a4c1d4482a77226d513537426bb114e8 Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.193937 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6s7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-27tjx_openstack-operators(6a6df481-1b16-40a6-b98a-efd2172c77d2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.195374 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" podUID="6a6df481-1b16-40a6-b98a-efd2172c77d2" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.257609 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" podUID="a50abab9-485d-40a2-80d7-5d3134a14908" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.296116 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4"] Oct 02 16:34:41 crc kubenswrapper[4882]: W1002 16:34:41.305556 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a766286_4d8e_45ce_b444_3dcf4cd9bf57.slice/crio-1826065819f850a765fd55e4f21b22d22cf4e8c92d39b124b89ae23c36e009ab WatchSource:0}: Error finding container 1826065819f850a765fd55e4f21b22d22cf4e8c92d39b124b89ae23c36e009ab: Status 404 returned error can't find the container with id 1826065819f850a765fd55e4f21b22d22cf4e8c92d39b124b89ae23c36e009ab Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.308452 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:d1fad97d2cd602a4f7b6fd6c202464ac117b20e6608c17aa04cadbceb78a498d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:1c99923410d4cd0a721d2cc8a51d91d3ac800d5fda508c972ebe1e85ed2ca4d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:af4e2467469edf3b1fa739ef819ead98dfa934542ae40ec3266d58f66ba44f99,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:99f246f3b9bad7c46b671da12cd166614f0573b3dbf0aa04f4b32d4a9f5a81c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:d617f09ab1f6ef522c6f70db597cf20ab79ccebf25e225653cbf2e999354a5c0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1c73b7b1034524ecfb36ce1eaa37ecbbcd5cb3f7fee0149b3bce0b0170bae8ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:0838a5c5edf54c1c8af59c93955f26e4eda6645297058780e0f61c77b65683d9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:c50baa554100db160210b65733f71d6d128e38f96fa0552819854c62ede75953,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:de50c7dd282aa3898f1d0a31ecb2a300688f1f234662e6bbe12f35f88b484083,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:31c0d98fec7ff16416903874af0addeff03a7e72ede256990f2a71589e8be5ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ac586b71d28a6240b29f4b464b19fea812ffc81e1182d172570b4be5ac58ea70,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha2
56:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:a5df039c808a65a273073128a627d6700897d6ebf81a9c62412c7d06be3b9a6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:8f09cdc578caa07e0b5a9ec4e96a251a6d7dd43b2ef1edacb56543c997c259e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:e870d0a1b0c758601a067bfccc539ca04222e0c867872f679cea5833e0fcbf94,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:8f112731484f983f272f4c95558ffa098e96e610ddc5130ee0f2b2a239e9058a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:277ac4620d95ce3fe2f552f59b82b70962ba024d498710adc45b863bcc7244ff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:09eebb1f87217fbb0249f4ebc19192cd282833aac27103081160b8949dd4361c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:e17eb4221e8981df97744e5168a8c759abcd925c2a483d04e3fdecd78128dae4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:02f99d84c8cc2c59ac4b8d98f219a1138b0aed8e50f91f9326ef55db5c187cd8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:7a636c7f518127d4292aa5417113fd611b85ad49ddbc8273455aa2fe5066a533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:61f617fd809b55b2eceeec84b3283757af80d1001659e80877ac69e9643ba89f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:0b083fceb6e323a30f4c7308a275ea88243420ef38df77ac322af302c4c4dd2d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:9e173574f9216e5c42498c3794075ead54b6850c66094c4be628b52063f5814c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:980d0d43a83e61b74634b46864c2070fcb26348f8bc5a3375f161703e4041d3d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:d561737cf54869c67a819635c4a10ca4a9ed21cc6046ffd4f17301670d9a25fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-neutron-dhcp-agent@sha256:941076bbb1577abd91f42e0f19b0a191f7e393135d823ed203b122875033888b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2133db6669a24570a266e7c053fc71bbfadd16cd9cd0bc8b87633e73c03c4719,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:55682010f0f5aea02f59df1e0a827cc6915048b7545c25432fb0cb8501898d0b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:814536e8e4848f6612cd4ada641d46ae7d766878b89918fc5df11f3930747d3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:a4f12a27e60f17034ba47f57dba0c5ae3f9e3c6c681f2e417bb87cb132f502e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:17b8c6c9fbcc7092cba64a264adb9af6decd7db24ee2c60607a9045d55031b51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:f0864392605772b30f07dcb67ec8bb75d5b779756c537983377044d899c1b099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:d9a44db937205e4c4f2cd2d247d230de2eb9207089f35a7ae7cfb11301406fac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1ab1deb86e7e5ba67b4cd9f5974de6707e5a5948e8f01fc1156dbf5e452340a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:a895c2b3a12aa21f9541a76213b6058ce3252aca002d66025d5935f4ea5873c7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:e7c778fd348f881fea490bb9ddf465347068a60fcd65f9cbfedb615815bba2a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:a21c91d6927d863be8aef3023a527bc3466a0ddffc018df0c970ce14396ceee0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:7053c79b8354195fd09a5ea1347ad49a35443923d4e4578f80615c63d83313d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:4626ebaa9dbe27fc95b31a48e69397fadef7c9779670c01555f872873c393f74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-ironic-neutron-agent@sha256:c840d7e9775d7f7ed1c6700d973bef79318fe92ac6fc8ed0616dcec13ef95c92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:fcb50aade382ff516554b84b45c742a5adafb460fd67bd0fa2fc7cbb30adf5c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:54373b05fcd33538d153507943da0c118e303a5c61a19c6bbe79a0786fe8ce1d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:8c9be58185245280d7282e8973cc6e23e6b08520ce126aeb91cfbcef0c144690,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:676ba6130835d00defc3214769d5fe1827ee41420a05f8556f361aac502a7efc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:3dbd2ac58b5f64ab3cf3eef3c44a52f0ccd363568c0739a5d18d6b9c9edddf5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:fded6f454a54e601894e06989243e8896f43940c77cd8f4c904fe43c120b1595,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:5d10c016b13499110b5f9ca2bccfaf6d2fd4298c9f02580d7208fe91850da0a6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:43f2c4ec2e38934288015cb5d5ae92941e8b3fa9a613539175641e2c16cfc0cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:c506a314e354f1ab274c46f9969b254f820e7515bbd9a24c9877dfbb10ece37e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:96d4b699758dd3d408b4c672dbe4392fd09783b4dc60783389905d7220b6524c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:7a0f3de7dda85fba7ad2929c7b01a2d42c11df9fe83f47a8e499a9da51e7f48c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:49b5ae7f895266b90cf3c02503fb7146726e59ad782fdf88112ad6954112d7e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:19b3d48b3c29eaa3a6d76fc145e212389f245c077bbf24eb5c1de0c96f3f7190,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:227891a9f4821a92c49ddc27301303287d5632b6ac199e9fe402581f1831ec01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-
health-manager@sha256:26e3ada4b9fee357ef8bbb1c342b38c49c096ede8a498116e3753ad45354fb47,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:2789b45ae2a5a9a80e4864e691f9e32fb9c9e1938cf92bda7c07defdbc78cdc2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:8df8259e737625667b13897dc0094bf3d7ced54f414dda93293ad4cb68af1d43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:2fe4f8c71e11a926450d6553e5cb5c7b2db5d0de8426aa969f30d3d566114ff8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:ab5265aef98352336f23b18080f3ba110250859dc0edc20819348311a4a53044,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:a3c1b94a285064d150145340c06ad5b0afc4aa20caa74523f3972c19b1d1ea61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:129e24971fee94cc60b5f440605f1512fb932a884e38e64122f38f11f942e3b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:b96baffbb926f93936bd52f2a1ef4fe1d31bb469d6489e9fb67bf00b99156551,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:d659d1ffbbaff7c76fc96e6600dc9b03c53af2c9d63cfb4626dfb5831b7b1ad7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:e3fcd72e1a2790ca7db5d5c40c1ae597de4b020dd51debcab063352e6e5f7d79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:2504c0db038b850cdd6057fc50e109715a4453c386e4f4d4f901a20dc7b2036a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e3accbf4293c544194bd2151d4d0bd8b26828ddacda968bad5d5a6f05c2406db,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:75ce8c4f9c68aaba6cab59749e726b2f94d29ba7b7897b18112fe1bd350efd8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:6390af78808d7cd69a4f5c7cb88f47690e54c9b8838b9461f4b21c4127ce770c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/
podified-antelope-centos9/openstack-swift-object@sha256:14489a8a681c482a643cb47fa90d0a3596b4570e13cfc760541ac80d37cd31b3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:87367a67c7cb73476fb8d08ba108da843ac61170381458608e778a33c024c0c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:d6123a9349d422888df97ee72d32643dd534f81c521f6f313c5d5e64e2db60c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:b273fd1e1da4190dc4cc67469d180b66b5a22eb6ec9afc76ef36dd6ea2beaea5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:9561306ec9455914cd05a0a0b3e56d72c7164aa41d0f0ef9b03ac7d7343538b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:1115e5a2dce397b4a34a082cba1937903818ab5928048fcf775c4a4e6dda2d07,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgcvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-7774cdf765ssbw4_openstack-operators(7a766286-4d8e-45ce-b444-3dcf4cd9bf57): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.453768 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" podUID="922b0b10-8ea8-4b15-b170-561690a29ff1" Oct 02 
16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.483898 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" podUID="7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.492812 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" podUID="7a766286-4d8e-45ce-b444-3dcf4cd9bf57" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.496941 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" podUID="8b6c43c1-61b7-42ea-b66e-a19e54e67b50" Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.528592 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" podUID="e05cf5dc-1295-4227-9527-f22cac2d26a0" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.550971 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.569447 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96795030-dccf-45ce-9aa5-6104145b247d-cert\") pod \"openstack-operator-controller-manager-6c9cff6d55-xtjhj\" (UID: \"96795030-dccf-45ce-9aa5-6104145b247d\") " pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.700320 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.841293 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" event={"ID":"08f8538a-699b-4ef0-9e48-da37040acdeb","Type":"ContainerStarted","Data":"e428db4b17c036fb9415df70f0d712bcdd0d137a32d91dc605118a54f052bb80"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.847710 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" event={"ID":"e05cf5dc-1295-4227-9527-f22cac2d26a0","Type":"ContainerStarted","Data":"81fa995398ea3b0b9757cda34bbf20079be039b21b08f4087208b560ab47f7e1"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.847759 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" event={"ID":"e05cf5dc-1295-4227-9527-f22cac2d26a0","Type":"ContainerStarted","Data":"14b4169651544db576a2a76281d6076ddd8b7ff9a7717ce9c2bf4dfdfb1c6df4"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.849016 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" podUID="e05cf5dc-1295-4227-9527-f22cac2d26a0" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.849383 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" event={"ID":"7a766286-4d8e-45ce-b444-3dcf4cd9bf57","Type":"ContainerStarted","Data":"ae440d783b757e2f34bd205ad83cd42c00ad15ccdbcf38b0ad461d547f2fe7d2"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.849415 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" event={"ID":"7a766286-4d8e-45ce-b444-3dcf4cd9bf57","Type":"ContainerStarted","Data":"1826065819f850a765fd55e4f21b22d22cf4e8c92d39b124b89ae23c36e009ab"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.851242 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" podUID="7a766286-4d8e-45ce-b444-3dcf4cd9bf57" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.852977 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" event={"ID":"6a6df481-1b16-40a6-b98a-efd2172c77d2","Type":"ContainerStarted","Data":"dd3dfc1c92b841b3958c90d7bfe86f93a4c1d4482a77226d513537426bb114e8"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.854281 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
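
[editor's note] The repeated "ErrImagePull: pull QPS exceeded" failures above are produced by the kubelet's client-side image-pull rate limit, configured by registryPullQPS and registryBurst in the kubelet configuration (defaults 5 and 10). A minimal Go sketch of a token bucket with those defaults, using the client-go flowcontrol package; the loop and counts are illustrative, not taken from this log:

package main

import (
	"fmt"

	"k8s.io/client-go/util/flowcontrol"
)

func main() {
	// Token bucket shaped like the kubelet defaults:
	// refill 5 tokens/s, hold at most 10.
	limiter := flowcontrol.NewTokenBucketRateLimiter(5, 10)

	// A burst of operator pods all pulling at once: roughly the first
	// ten attempts pass, the rest fail the way the log above shows.
	for i := 1; i <= 15; i++ {
		if limiter.TryAccept() {
			fmt.Printf("pull %2d: started\n", i)
		} else {
			fmt.Printf("pull %2d: ErrImagePull: pull QPS exceeded\n", i)
		}
	}
}
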
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" podUID="6a6df481-1b16-40a6-b98a-efd2172c77d2" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.855840 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" event={"ID":"2c520177-6db1-41dd-808a-49f676f9870c","Type":"ContainerStarted","Data":"3d89355fe6895050c25dd434a8a1f624ce2455e770bc213a748118678bcadbb5"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.858829 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" event={"ID":"63f23924-224e-40d8-901a-bbb56f30163d","Type":"ContainerStarted","Data":"cbe03bae4f8efdd16e085381b429f91fc4bf1ce891cb17819a13afd97192fe6d"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.860366 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" event={"ID":"a50abab9-485d-40a2-80d7-5d3134a14908","Type":"ContainerStarted","Data":"20296b5909482989c17e2fb8f612bdbdcb78b0211595e20fa0a625b1b3d2b553"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.860426 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" event={"ID":"a50abab9-485d-40a2-80d7-5d3134a14908","Type":"ContainerStarted","Data":"627c5fd7704b00e79e3a34178f492b4f7797efc0df3b5c4e03bfa3bff7768fbe"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.861479 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" podUID="a50abab9-485d-40a2-80d7-5d3134a14908" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.862074 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" event={"ID":"968e86ab-e082-4b95-a520-67672bc1d662","Type":"ContainerStarted","Data":"52fe1a802f5e1ddd2fcd11d475029d24f7fade23c6b7a33595407974335ca05c"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.866757 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" event={"ID":"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4","Type":"ContainerStarted","Data":"0e83f117e4a7740289f7549fd20920e9813e73966be060ddb80cacaaa65bd51b"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.871392 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" event={"ID":"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7","Type":"ContainerStarted","Data":"999704a7ded8e209701d74d463caa505c7e05e1bf84be3a5fac72357a07f3ee8"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.871434 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" event={"ID":"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7","Type":"ContainerStarted","Data":"cbed666bbdf9a3b153aa2d2bf4c4ed9a2482835e1d6ac389128d89a137e77583"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.878810 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" event={"ID":"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88","Type":"ContainerStarted","Data":"b58ca174c7e5bb779be49e203d0c4db858c558144190d8fab3eb6163ff3ca2e6"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.885284 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" event={"ID":"aee0bda3-847c-41b1-bb44-b2bd8875c9a1","Type":"ContainerStarted","Data":"45455f5a3fe678d50a4275d82b307122e531cdf60d1bde66839b81481a02db2d"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.893486 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" podUID="7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.898424 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" event={"ID":"75e8c312-5fcf-4106-953f-85077c2485aa","Type":"ContainerStarted","Data":"13c16b4b0ba7dde79d5232731432a01994796764f821fd591103aff97bc9894d"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.905385 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" event={"ID":"8b6c43c1-61b7-42ea-b66e-a19e54e67b50","Type":"ContainerStarted","Data":"926821c1f95b9a4db8c8d71979ac9c45804c82583af4add344b14d1429852e78"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.905434 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" event={"ID":"8b6c43c1-61b7-42ea-b66e-a19e54e67b50","Type":"ContainerStarted","Data":"ff209f7decfdd0048fc559dc6a51f876639ad4cddd4fc04cae87c7934a45e0a0"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.915705 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" podUID="8b6c43c1-61b7-42ea-b66e-a19e54e67b50" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.967693 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" event={"ID":"922b0b10-8ea8-4b15-b170-561690a29ff1","Type":"ContainerStarted","Data":"007e735718ad758cb74cc19240bee3aa2f9371214cbbbdcf665f66c6eeacd60a"} Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.967743 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" event={"ID":"922b0b10-8ea8-4b15-b170-561690a29ff1","Type":"ContainerStarted","Data":"91b70d720171cdf25faa494213f2c484bc8ba09962dc31f168ae4590b598b3d4"} Oct 02 16:34:41 crc kubenswrapper[4882]: E1002 16:34:41.972462 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" podUID="922b0b10-8ea8-4b15-b170-561690a29ff1" Oct 02 16:34:41 crc kubenswrapper[4882]: I1002 16:34:41.975127 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" event={"ID":"8f5b9997-c12c-4ba9-9efe-649f9d03e52e","Type":"ContainerStarted","Data":"318ac8b9a25e2c6bee5414fc71eb285bbe2c3b9874e38d911b3effdafa05be66"} Oct 02 16:34:42 crc kubenswrapper[4882]: I1002 16:34:42.292304 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj"] Oct 02 16:34:42 crc kubenswrapper[4882]: W1002 16:34:42.332892 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96795030_dccf_45ce_9aa5_6104145b247d.slice/crio-da215ad7b77b3c946fbd5039a5b0af73f64eaaed2fe57fff079c486cdcc22652 WatchSource:0}: Error finding container da215ad7b77b3c946fbd5039a5b0af73f64eaaed2fe57fff079c486cdcc22652: Status 404 returned error can't find the container with id da215ad7b77b3c946fbd5039a5b0af73f64eaaed2fe57fff079c486cdcc22652 Oct 02 16:34:42 crc kubenswrapper[4882]: I1002 16:34:42.992501 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" event={"ID":"96795030-dccf-45ce-9aa5-6104145b247d","Type":"ContainerStarted","Data":"40c1fc6c057c89a6e998a4c2af11d0a7fd10f47a991d50058dc0666e75e5d635"} Oct 02 16:34:42 crc kubenswrapper[4882]: I1002 16:34:42.992829 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" event={"ID":"96795030-dccf-45ce-9aa5-6104145b247d","Type":"ContainerStarted","Data":"4bb341804809f2d41afa87f28827ea4f3f29f33d6767829366e17f1c7e064d0f"} Oct 02 16:34:42 crc kubenswrapper[4882]: I1002 16:34:42.992846 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:42 crc kubenswrapper[4882]: I1002 16:34:42.992856 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" event={"ID":"96795030-dccf-45ce-9aa5-6104145b247d","Type":"ContainerStarted","Data":"da215ad7b77b3c946fbd5039a5b0af73f64eaaed2fe57fff079c486cdcc22652"} Oct 02 16:34:42 crc kubenswrapper[4882]: E1002 16:34:42.993234 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" podUID="6a6df481-1b16-40a6-b98a-efd2172c77d2" Oct 02 16:34:42 crc kubenswrapper[4882]: E1002 16:34:42.993760 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b7409dcf05c85eab205904d29d4276f8e927c772eba6363ecfa21ab10c4aaa01\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" 
podUID="a50abab9-485d-40a2-80d7-5d3134a14908" Oct 02 16:34:42 crc kubenswrapper[4882]: E1002 16:34:42.993831 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" podUID="922b0b10-8ea8-4b15-b170-561690a29ff1" Oct 02 16:34:43 crc kubenswrapper[4882]: E1002 16:34:43.003439 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" podUID="e05cf5dc-1295-4227-9527-f22cac2d26a0" Oct 02 16:34:43 crc kubenswrapper[4882]: E1002 16:34:43.003534 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" podUID="7a766286-4d8e-45ce-b444-3dcf4cd9bf57" Oct 02 16:34:43 crc kubenswrapper[4882]: E1002 16:34:43.003580 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" podUID="7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7" Oct 02 16:34:43 crc kubenswrapper[4882]: E1002 16:34:43.003617 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" podUID="8b6c43c1-61b7-42ea-b66e-a19e54e67b50" Oct 02 16:34:43 crc kubenswrapper[4882]: I1002 16:34:43.228921 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" podStartSLOduration=4.228904346 podStartE2EDuration="4.228904346s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:34:43.226415605 +0000 UTC m=+1041.975645132" watchObservedRunningTime="2025-10-02 16:34:43.228904346 +0000 UTC m=+1041.978133873" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.043067 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" event={"ID":"968e86ab-e082-4b95-a520-67672bc1d662","Type":"ContainerStarted","Data":"13ee119efc7ad8246e812a81746b6854130dc50d28c40a9db672e461d6153cfb"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.043754 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" event={"ID":"968e86ab-e082-4b95-a520-67672bc1d662","Type":"ContainerStarted","Data":"f23e91df7d3346efceb85630813d3e29597e13bc5ad29572c56ffd808727e5c8"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.043872 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.046328 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" event={"ID":"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4","Type":"ContainerStarted","Data":"9541729fa2173b8793196ea36c6c0263a86810f7dbc8be28c4c7d122f143dfe0"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.046378 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" event={"ID":"8b1d8fdd-7f24-41c3-a604-7bd8df0972c4","Type":"ContainerStarted","Data":"f74ff2e8beb83bdc2647a1468ca6639088a70e51c9253f9af6a1e454c077cb48"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.047081 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.049312 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" event={"ID":"aece5dae-37d5-44cc-b83c-611712686fbb","Type":"ContainerStarted","Data":"be0979b11d68e868e05ddabb4ada8ecedfc0527411827527495f9b0bb2086f51"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.049358 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" event={"ID":"aece5dae-37d5-44cc-b83c-611712686fbb","Type":"ContainerStarted","Data":"a59edd6fabfacfeb5ab6cf1f6ddcccf8ac8a3f1f0d5dfb0f9641eb6df560a846"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.049511 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.051354 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" event={"ID":"aee0bda3-847c-41b1-bb44-b2bd8875c9a1","Type":"ContainerStarted","Data":"cbea32d18bf3f762617f7fdd605cec9a07e7231fa7191b935aae05d8ee98dfd4"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.051394 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" event={"ID":"aee0bda3-847c-41b1-bb44-b2bd8875c9a1","Type":"ContainerStarted","Data":"162540e264aae9210105ff942392d990265bd8b2700cbf0fc442f9b0fd08a37c"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.051485 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.053103 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" event={"ID":"75e8c312-5fcf-4106-953f-85077c2485aa","Type":"ContainerStarted","Data":"201c3a4bdfe8fe02990084ba5777215a126ebe2954a5ec17d02bd46b79454044"} Oct 02 16:34:48 crc 
kubenswrapper[4882]: I1002 16:34:48.053138 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" event={"ID":"75e8c312-5fcf-4106-953f-85077c2485aa","Type":"ContainerStarted","Data":"1a6e263e90199bb769dd1eff1735cffaf19a64bdfdbb274c3d1c79f2e0637773"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.053708 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.056604 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" event={"ID":"56d9b46f-821f-44dd-8139-4b3d1c2b1149","Type":"ContainerStarted","Data":"29b2be3d36806661fcc8a307fb4088744fc7d6dbcca7aee5a5ccf2984295b240"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.056641 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" event={"ID":"56d9b46f-821f-44dd-8139-4b3d1c2b1149","Type":"ContainerStarted","Data":"dc144b354cf50b4e544cb7cc6ed03ba4107e0da6cd0c7019444008b33afae257"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.057169 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.061533 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" event={"ID":"63f23924-224e-40d8-901a-bbb56f30163d","Type":"ContainerStarted","Data":"0b15faa23c8dddee2f359281deede0921ab528e3b29b002f130a089557a1cb0e"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.061922 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.061936 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" event={"ID":"63f23924-224e-40d8-901a-bbb56f30163d","Type":"ContainerStarted","Data":"ed4cf4284ba2855917f13f3885dd42e7c03466973e75a363d40f8adca0a514bf"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.063404 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" event={"ID":"08f8538a-699b-4ef0-9e48-da37040acdeb","Type":"ContainerStarted","Data":"a87e142cf9427f9719e841409c732adc595c7666f8ad735b62889a331172609e"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.063449 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" event={"ID":"08f8538a-699b-4ef0-9e48-da37040acdeb","Type":"ContainerStarted","Data":"bcdabff3b5bef78358ce2d669023e1bf89a9ff9f4fc5fe75a3aa4d185a6f7084"} Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.063919 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.070046 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" podStartSLOduration=3.489164579 podStartE2EDuration="9.070030242s" 
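
[editor's note] The pod_startup_latency_tracker records here report both podStartSLOduration and podStartE2EDuration; the SLO figure excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). Treating that subtraction as the tracker's rule is an inference about kubelet internals, but it reproduces the placement-operator numbers in this log exactly:

package main

import (
	"fmt"
	"time"
)

func main() {
	e2e := 9070030242 * time.Nanosecond // podStartE2EDuration="9.070030242s"

	// Timestamps copied from the placement-operator record; parse
	// errors are ignored in this sketch for brevity.
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	firstPull, _ := time.Parse(layout, "2025-10-02 16:34:41.180651378 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2025-10-02 16:34:46.761517041 +0000 UTC")

	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 3.489164579s, matching podStartSLOduration
}
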
podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.180651378 +0000 UTC m=+1039.929880905" lastFinishedPulling="2025-10-02 16:34:46.761517041 +0000 UTC m=+1045.510746568" observedRunningTime="2025-10-02 16:34:48.069407577 +0000 UTC m=+1046.818637094" watchObservedRunningTime="2025-10-02 16:34:48.070030242 +0000 UTC m=+1046.819259769" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.089230 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" podStartSLOduration=3.928601638 podStartE2EDuration="10.089190277s" podCreationTimestamp="2025-10-02 16:34:38 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.581331107 +0000 UTC m=+1039.330560634" lastFinishedPulling="2025-10-02 16:34:46.741919716 +0000 UTC m=+1045.491149273" observedRunningTime="2025-10-02 16:34:48.085769802 +0000 UTC m=+1046.834999349" watchObservedRunningTime="2025-10-02 16:34:48.089190277 +0000 UTC m=+1046.838419804" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.107936 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" podStartSLOduration=2.870432987 podStartE2EDuration="9.10791186s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.502902025 +0000 UTC m=+1039.252131552" lastFinishedPulling="2025-10-02 16:34:46.740380858 +0000 UTC m=+1045.489610425" observedRunningTime="2025-10-02 16:34:48.107469179 +0000 UTC m=+1046.856698706" watchObservedRunningTime="2025-10-02 16:34:48.10791186 +0000 UTC m=+1046.857141387" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.142742 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" podStartSLOduration=3.387818829 podStartE2EDuration="9.142680931s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.988190821 +0000 UTC m=+1039.737420348" lastFinishedPulling="2025-10-02 16:34:46.743052883 +0000 UTC m=+1045.492282450" observedRunningTime="2025-10-02 16:34:48.128299815 +0000 UTC m=+1046.877529362" watchObservedRunningTime="2025-10-02 16:34:48.142680931 +0000 UTC m=+1046.891910458" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.161261 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" podStartSLOduration=3.386043596 podStartE2EDuration="9.161244601s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.966557046 +0000 UTC m=+1039.715786573" lastFinishedPulling="2025-10-02 16:34:46.741758021 +0000 UTC m=+1045.490987578" observedRunningTime="2025-10-02 16:34:48.159172 +0000 UTC m=+1046.908401527" watchObservedRunningTime="2025-10-02 16:34:48.161244601 +0000 UTC m=+1046.910474128" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.204651 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" podStartSLOduration=3.353290325 podStartE2EDuration="9.204621865s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.899098536 +0000 UTC m=+1039.648328063" lastFinishedPulling="2025-10-02 16:34:46.750430076 +0000 UTC m=+1045.499659603" observedRunningTime="2025-10-02 
16:34:48.186866196 +0000 UTC m=+1046.936095723" watchObservedRunningTime="2025-10-02 16:34:48.204621865 +0000 UTC m=+1046.953851392" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.206242 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" podStartSLOduration=3.361991009 podStartE2EDuration="9.206235904s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.894688056 +0000 UTC m=+1039.643917583" lastFinishedPulling="2025-10-02 16:34:46.738932911 +0000 UTC m=+1045.488162478" observedRunningTime="2025-10-02 16:34:48.203477386 +0000 UTC m=+1046.952706913" watchObservedRunningTime="2025-10-02 16:34:48.206235904 +0000 UTC m=+1046.955465431" Oct 02 16:34:48 crc kubenswrapper[4882]: I1002 16:34:48.227654 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" podStartSLOduration=3.404827371 podStartE2EDuration="9.227639035s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.917675746 +0000 UTC m=+1039.666905273" lastFinishedPulling="2025-10-02 16:34:46.74048738 +0000 UTC m=+1045.489716937" observedRunningTime="2025-10-02 16:34:48.224370004 +0000 UTC m=+1046.973599531" watchObservedRunningTime="2025-10-02 16:34:48.227639035 +0000 UTC m=+1046.976868562" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.545437 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b: reading manifest sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b in quay.io/openstack-k8s-operators/cinder-operator: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.545675 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxbpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8686fd99f7-nwf78_openstack-operators(cd42cabb-2afc-45b4-aecd-9d97980e0840): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b: reading manifest sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b in quay.io/openstack-k8s-operators/cinder-operator: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.612085 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:91e11b31f0c969125d9883b1f765e7c99a62f639b11fab568dec82b38f8cfe74" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.612755 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:91e11b31f0c969125d9883b1f765e7c99a62f639b11fab568dec82b38f8cfe74,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbl2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6d6d64fdcf-mx2hz_openstack-operators(d32146d6-c9aa-4864-bab5-71beebc6c6ef): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.712555 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b: reading manifest sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b in quay.io/openstack-k8s-operators/cinder-operator: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" podUID="cd42cabb-2afc-45b4-aecd-9d97980e0840" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.731360 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/glance-operator@sha256:9fed055cd1f09627ef351e61c7e42227570193ccd5d33167a607c49b442a9d87" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.731516 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:9fed055cd1f09627ef351e61c7e42227570193ccd5d33167a607c49b442a9d87,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p25gh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-d785ddfd5-2sngh_openstack-operators(cdf5d013-2e3a-41f5-ad36-c870b219a572): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.771075 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" podUID="d32146d6-c9aa-4864-bab5-71beebc6c6ef" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.862413 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" podUID="cdf5d013-2e3a-41f5-ad36-c870b219a572" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.923911 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7" Oct 02 16:34:50 crc kubenswrapper[4882]: E1002 16:34:50.924104 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nmlq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6c9969c6c6-bhztc_openstack-operators(131678a2-c80e-4990-85ef-9f8ed80cb4da): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.018381 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12: can't talk to a V1 container registry" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.018557 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cckvl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-59b5fc9845-krzwg_openstack-operators(2c520177-6db1-41dd-808a-49f676f9870c): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12: can't talk to a V1 container registry" logger="UnhandledError" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.048000 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" podUID="131678a2-c80e-4990-85ef-9f8ed80cb4da" Oct 02 16:34:51 crc kubenswrapper[4882]: I1002 16:34:51.084558 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" event={"ID":"cdf5d013-2e3a-41f5-ad36-c870b219a572","Type":"ContainerStarted","Data":"c1b49e44634f45ad3c55d180ba6201bf96f9630548fc819baa0abe5da1827a37"} Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.086463 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:9fed055cd1f09627ef351e61c7e42227570193ccd5d33167a607c49b442a9d87\\\"\"" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" podUID="cdf5d013-2e3a-41f5-ad36-c870b219a572" Oct 02 16:34:51 crc kubenswrapper[4882]: I1002 16:34:51.086687 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" event={"ID":"d32146d6-c9aa-4864-bab5-71beebc6c6ef","Type":"ContainerStarted","Data":"fcf93e7a5e4f8eaa6b39e75c68452b23ab1837d5f34ab39fa6f2494a9a3a9bf9"} Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.087643 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:91e11b31f0c969125d9883b1f765e7c99a62f639b11fab568dec82b38f8cfe74\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" podUID="d32146d6-c9aa-4864-bab5-71beebc6c6ef" Oct 02 16:34:51 crc 
kubenswrapper[4882]: I1002 16:34:51.087963 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" event={"ID":"131678a2-c80e-4990-85ef-9f8ed80cb4da","Type":"ContainerStarted","Data":"f3c75130acf40eddc03de3c39452256f78c54cbfa1c3b11fccce78ded81f5b61"} Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.089067 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" podUID="131678a2-c80e-4990-85ef-9f8ed80cb4da" Oct 02 16:34:51 crc kubenswrapper[4882]: I1002 16:34:51.089878 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" event={"ID":"cd42cabb-2afc-45b4-aecd-9d97980e0840","Type":"ContainerStarted","Data":"729a7e8b8dff0390a5091bd8cb7161c24b88f38d1e6d25f767e0373127873872"} Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.090901 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" podUID="cd42cabb-2afc-45b4-aecd-9d97980e0840" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.091284 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa: can't talk to a V1 container registry" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.091460 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nk4dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5ffb97cddf-qmnwx_openstack-operators(a430ae1a-9023-4c9d-a4ee-ed474e3bbd88): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa: can't talk to a V1 container registry" logger="UnhandledError" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.095801 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed: can't talk to a V1 container registry" image="quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.095954 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h7wv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-76d5577b-25zdk_openstack-operators(8f5b9997-c12c-4ba9-9efe-649f9d03e52e): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed: can't talk to a V1 container registry" logger="UnhandledError" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.171274 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12: can't talk to a V1 container registry\"" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" podUID="2c520177-6db1-41dd-808a-49f676f9870c" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.252316 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa: can't talk to a V1 container registry\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" podUID="a430ae1a-9023-4c9d-a4ee-ed474e3bbd88" Oct 02 16:34:51 crc kubenswrapper[4882]: E1002 16:34:51.252959 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed: can't talk to a V1 container registry\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" podUID="8f5b9997-c12c-4ba9-9efe-649f9d03e52e" Oct 02 16:34:51 crc kubenswrapper[4882]: I1002 16:34:51.706622 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6c9cff6d55-xtjhj" Oct 02 16:34:52 crc kubenswrapper[4882]: I1002 16:34:52.101847 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" event={"ID":"8f5b9997-c12c-4ba9-9efe-649f9d03e52e","Type":"ContainerStarted","Data":"f33ab73bbb898163957c563fbb24dc75c08ca193a36176fc7f5a995ff9952a40"} Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.104444 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" podUID="8f5b9997-c12c-4ba9-9efe-649f9d03e52e" Oct 02 16:34:52 crc kubenswrapper[4882]: I1002 16:34:52.105359 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" event={"ID":"2c520177-6db1-41dd-808a-49f676f9870c","Type":"ContainerStarted","Data":"e71d1a5a1f7f997501f8b9caf4d85eb5eef9fe91dc7567ea6a2e2ce3714f618d"} Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.107993 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" podUID="2c520177-6db1-41dd-808a-49f676f9870c" Oct 02 16:34:52 crc kubenswrapper[4882]: I1002 16:34:52.111075 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" event={"ID":"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88","Type":"ContainerStarted","Data":"449be548059b7f7444320fda114f42cefba587cf4af0459fe452a9407d6d4b3a"} Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.111892 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:91e11b31f0c969125d9883b1f765e7c99a62f639b11fab568dec82b38f8cfe74\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" podUID="d32146d6-c9aa-4864-bab5-71beebc6c6ef" Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.111966 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" podUID="a430ae1a-9023-4c9d-a4ee-ed474e3bbd88" Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.112770 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" podUID="131678a2-c80e-4990-85ef-9f8ed80cb4da" Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.112813 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:229213522e05cbd3034bb80a8ddb1c701cf5f6d74c696e8085597ef6da27ca4b\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" podUID="cd42cabb-2afc-45b4-aecd-9d97980e0840" Oct 02 16:34:52 crc kubenswrapper[4882]: E1002 16:34:52.113975 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/glance-operator@sha256:9fed055cd1f09627ef351e61c7e42227570193ccd5d33167a607c49b442a9d87\\\"\"" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" podUID="cdf5d013-2e3a-41f5-ad36-c870b219a572" Oct 02 16:34:53 crc kubenswrapper[4882]: E1002 16:34:53.117499 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" podUID="8f5b9997-c12c-4ba9-9efe-649f9d03e52e" Oct 02 16:34:53 crc kubenswrapper[4882]: E1002 16:34:53.117816 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" podUID="2c520177-6db1-41dd-808a-49f676f9870c" Oct 02 16:34:53 crc kubenswrapper[4882]: E1002 16:34:53.117856 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" podUID="a430ae1a-9023-4c9d-a4ee-ed474e3bbd88" Oct 02 16:34:54 crc kubenswrapper[4882]: I1002 16:34:54.761884 4882 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.365375 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-j4mbs" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.429795 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-4z9l6" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.469655 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-t5nnp" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.594082 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-qb4lj" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.627966 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-ddwmt" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.730927 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-hhcmt" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.762318 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-x6jfr" Oct 02 16:34:59 crc kubenswrapper[4882]: I1002 16:34:59.938735 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-hvnhq" Oct 02 16:35:00 crc kubenswrapper[4882]: I1002 16:35:00.181146 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" event={"ID":"922b0b10-8ea8-4b15-b170-561690a29ff1","Type":"ContainerStarted","Data":"ec5e19248ae159466e2300515361f09ab5109b3623c004d034672a249221fe20"} Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.195356 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" event={"ID":"a50abab9-485d-40a2-80d7-5d3134a14908","Type":"ContainerStarted","Data":"b7745ab3174ffcfbf383f948498e194344911c67eb5ecb1de7af8b1c92669ea8"} Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.196028 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.198896 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" event={"ID":"7a766286-4d8e-45ce-b444-3dcf4cd9bf57","Type":"ContainerStarted","Data":"60b9aeb19a6f359a3fcf2064aa89f21a850883ee41797b26c3f8f101498219e8"} Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.199125 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.203033 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" event={"ID":"8b6c43c1-61b7-42ea-b66e-a19e54e67b50","Type":"ContainerStarted","Data":"3bb556af1feea43d0c97f0ca5dd7e96e36bee16b92f290581202f0ecf6411f5e"} Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.203503 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.216674 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" podStartSLOduration=3.520652518 podStartE2EDuration="22.216655598s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.003561572 +0000 UTC m=+1039.752791109" lastFinishedPulling="2025-10-02 16:34:59.699564672 +0000 UTC m=+1058.448794189" observedRunningTime="2025-10-02 16:35:01.214084305 +0000 UTC m=+1059.963313842" watchObservedRunningTime="2025-10-02 16:35:01.216655598 +0000 UTC m=+1059.965885125" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.253988 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" podStartSLOduration=3.925713729 podStartE2EDuration="22.253965542s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.30794774 +0000 UTC m=+1040.057177267" lastFinishedPulling="2025-10-02 16:34:59.636199553 +0000 UTC m=+1058.385429080" observedRunningTime="2025-10-02 16:35:01.244103077 +0000 UTC m=+1059.993332604" watchObservedRunningTime="2025-10-02 16:35:01.253965542 +0000 UTC m=+1060.003195059" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.264416 4882 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" podStartSLOduration=3.744220655 podStartE2EDuration="22.2643918s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.116880208 +0000 UTC m=+1039.866109735" lastFinishedPulling="2025-10-02 16:34:59.637051353 +0000 UTC m=+1058.386280880" observedRunningTime="2025-10-02 16:35:01.261141109 +0000 UTC m=+1060.010370636" watchObservedRunningTime="2025-10-02 16:35:01.2643918 +0000 UTC m=+1060.013621327" Oct 02 16:35:01 crc kubenswrapper[4882]: I1002 16:35:01.281027 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" podStartSLOduration=3.758890928 podStartE2EDuration="22.281007902s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.115628137 +0000 UTC m=+1039.864857664" lastFinishedPulling="2025-10-02 16:34:59.637745111 +0000 UTC m=+1058.386974638" observedRunningTime="2025-10-02 16:35:01.279942455 +0000 UTC m=+1060.029171982" watchObservedRunningTime="2025-10-02 16:35:01.281007902 +0000 UTC m=+1060.030237459" Oct 02 16:35:03 crc kubenswrapper[4882]: I1002 16:35:03.221749 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" event={"ID":"6a6df481-1b16-40a6-b98a-efd2172c77d2","Type":"ContainerStarted","Data":"4e083c4dd2f8f57c33bf7b350b1473f2d9959813f52e3ddb835b8d30bc5b16e8"} Oct 02 16:35:03 crc kubenswrapper[4882]: I1002 16:35:03.224124 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" event={"ID":"7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7","Type":"ContainerStarted","Data":"a22dc4a7b59de8b148db8a9c385d5c70d68d0f4bd97982f1ae838c821e9c5ec6"} Oct 02 16:35:03 crc kubenswrapper[4882]: I1002 16:35:03.225182 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:35:03 crc kubenswrapper[4882]: I1002 16:35:03.227425 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" event={"ID":"e05cf5dc-1295-4227-9527-f22cac2d26a0","Type":"ContainerStarted","Data":"59677dc6d935d2e9f26edfc10fcfe8c016e1316d98a64f54af84d6bc95bcf162"} Oct 02 16:35:03 crc kubenswrapper[4882]: I1002 16:35:03.244942 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" podStartSLOduration=2.435542228 podStartE2EDuration="24.244921922s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.144947853 +0000 UTC m=+1039.894177380" lastFinishedPulling="2025-10-02 16:35:02.954327557 +0000 UTC m=+1061.703557074" observedRunningTime="2025-10-02 16:35:03.240313038 +0000 UTC m=+1061.989542575" watchObservedRunningTime="2025-10-02 16:35:03.244921922 +0000 UTC m=+1061.994151449" Oct 02 16:35:04 crc kubenswrapper[4882]: I1002 16:35:04.254866 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-27tjx" podStartSLOduration=3.4874588859999998 podStartE2EDuration="25.25484791s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.193756042 +0000 UTC 
m=+1039.942985569" lastFinishedPulling="2025-10-02 16:35:02.961145066 +0000 UTC m=+1061.710374593" observedRunningTime="2025-10-02 16:35:04.252313077 +0000 UTC m=+1063.001542644" watchObservedRunningTime="2025-10-02 16:35:04.25484791 +0000 UTC m=+1063.004077447" Oct 02 16:35:04 crc kubenswrapper[4882]: I1002 16:35:04.277635 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" podStartSLOduration=3.50253757 podStartE2EDuration="25.277607754s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:41.180690688 +0000 UTC m=+1039.929920215" lastFinishedPulling="2025-10-02 16:35:02.955760872 +0000 UTC m=+1061.704990399" observedRunningTime="2025-10-02 16:35:04.269060502 +0000 UTC m=+1063.018290029" watchObservedRunningTime="2025-10-02 16:35:04.277607754 +0000 UTC m=+1063.026837301" Oct 02 16:35:08 crc kubenswrapper[4882]: I1002 16:35:08.269848 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" event={"ID":"8f5b9997-c12c-4ba9-9efe-649f9d03e52e","Type":"ContainerStarted","Data":"0b76475f416feb4cdeedb43d676713532a26fba579da605bb57b88e2006c5e60"} Oct 02 16:35:08 crc kubenswrapper[4882]: I1002 16:35:08.270349 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:35:08 crc kubenswrapper[4882]: I1002 16:35:08.292903 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" podStartSLOduration=2.705414382 podStartE2EDuration="29.29287534s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.988273483 +0000 UTC m=+1039.737503010" lastFinishedPulling="2025-10-02 16:35:07.575734441 +0000 UTC m=+1066.324963968" observedRunningTime="2025-10-02 16:35:08.288024309 +0000 UTC m=+1067.037253836" watchObservedRunningTime="2025-10-02 16:35:08.29287534 +0000 UTC m=+1067.042104867" Oct 02 16:35:09 crc kubenswrapper[4882]: I1002 16:35:09.797063 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-dfckg" Oct 02 16:35:09 crc kubenswrapper[4882]: I1002 16:35:09.826445 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:35:09 crc kubenswrapper[4882]: I1002 16:35:09.831626 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-tqmjh" Oct 02 16:35:10 crc kubenswrapper[4882]: I1002 16:35:10.036416 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" Oct 02 16:35:10 crc kubenswrapper[4882]: I1002 16:35:10.041656 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-lgpvs" Oct 02 16:35:10 crc kubenswrapper[4882]: I1002 16:35:10.085860 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-twxqz" Oct 02 16:35:10 crc kubenswrapper[4882]: I1002 16:35:10.144096 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-5sm69" Oct 02 16:35:10 crc kubenswrapper[4882]: I1002 16:35:10.420900 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7774cdf765ssbw4" Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.313859 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" event={"ID":"cd42cabb-2afc-45b4-aecd-9d97980e0840","Type":"ContainerStarted","Data":"de5e6b782e70d463239aedf4635c77be91e9ff62b13493ce0c41e6107a6bdd34"} Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.316142 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.325318 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" event={"ID":"2c520177-6db1-41dd-808a-49f676f9870c","Type":"ContainerStarted","Data":"51cbd88e49eb975102b2620609cd7ccd953e6ce24552b214ce533e8d72942c03"} Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.326069 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.347828 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" podStartSLOduration=2.621612334 podStartE2EDuration="35.34780593s" podCreationTimestamp="2025-10-02 16:34:38 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.195612596 +0000 UTC m=+1038.944842123" lastFinishedPulling="2025-10-02 16:35:12.921806192 +0000 UTC m=+1071.671035719" observedRunningTime="2025-10-02 16:35:13.337846144 +0000 UTC m=+1072.087075671" watchObservedRunningTime="2025-10-02 16:35:13.34780593 +0000 UTC m=+1072.097035447" Oct 02 16:35:13 crc kubenswrapper[4882]: I1002 16:35:13.365145 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" podStartSLOduration=2.34959057 podStartE2EDuration="34.365116668s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.906248553 +0000 UTC m=+1039.655478080" lastFinishedPulling="2025-10-02 16:35:12.921774651 +0000 UTC m=+1071.671004178" observedRunningTime="2025-10-02 16:35:13.361621232 +0000 UTC m=+1072.110850779" watchObservedRunningTime="2025-10-02 16:35:13.365116668 +0000 UTC m=+1072.114346195" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.333732 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" event={"ID":"cdf5d013-2e3a-41f5-ad36-c870b219a572","Type":"ContainerStarted","Data":"b2072ec23a9ed930a2e5e7379687354e68e2cef883cbb478a3bf578ad8945e8d"} Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.333928 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.335913 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" 
event={"ID":"d32146d6-c9aa-4864-bab5-71beebc6c6ef","Type":"ContainerStarted","Data":"d514937ad9041d0cf9b6f5c95e6a5c43b87dacd322e908e0609ec931ee7cf01d"} Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.336288 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.338652 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" event={"ID":"a430ae1a-9023-4c9d-a4ee-ed474e3bbd88","Type":"ContainerStarted","Data":"f3069186cee071e9c2c61cdacff5a4a615213480a334480a27894f86e0edb325"} Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.338934 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.340306 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" event={"ID":"131678a2-c80e-4990-85ef-9f8ed80cb4da","Type":"ContainerStarted","Data":"41ee353845e42daed61ffbe15ae6062da5b079158ed13b2a9729c5476428e532"} Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.340690 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.350890 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" podStartSLOduration=3.578909169 podStartE2EDuration="36.350875328s" podCreationTimestamp="2025-10-02 16:34:38 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.251816407 +0000 UTC m=+1039.001045934" lastFinishedPulling="2025-10-02 16:35:13.023782566 +0000 UTC m=+1071.773012093" observedRunningTime="2025-10-02 16:35:14.350360915 +0000 UTC m=+1073.099590462" watchObservedRunningTime="2025-10-02 16:35:14.350875328 +0000 UTC m=+1073.100104855" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.372829 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" podStartSLOduration=3.040062647 podStartE2EDuration="35.372800401s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.60329152 +0000 UTC m=+1039.352521037" lastFinishedPulling="2025-10-02 16:35:12.936029254 +0000 UTC m=+1071.685258791" observedRunningTime="2025-10-02 16:35:14.364035154 +0000 UTC m=+1073.113264681" watchObservedRunningTime="2025-10-02 16:35:14.372800401 +0000 UTC m=+1073.122029928" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.381380 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" podStartSLOduration=3.592164757 podStartE2EDuration="36.381349442s" podCreationTimestamp="2025-10-02 16:34:38 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.146350766 +0000 UTC m=+1038.895580293" lastFinishedPulling="2025-10-02 16:35:12.935535441 +0000 UTC m=+1071.684764978" observedRunningTime="2025-10-02 16:35:14.380980163 +0000 UTC m=+1073.130209690" watchObservedRunningTime="2025-10-02 16:35:14.381349442 +0000 UTC m=+1073.130578969" Oct 02 16:35:14 crc kubenswrapper[4882]: I1002 16:35:14.403547 4882 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" podStartSLOduration=3.357274593 podStartE2EDuration="35.403522692s" podCreationTimestamp="2025-10-02 16:34:39 +0000 UTC" firstStartedPulling="2025-10-02 16:34:40.989791541 +0000 UTC m=+1039.739021068" lastFinishedPulling="2025-10-02 16:35:13.03603964 +0000 UTC m=+1071.785269167" observedRunningTime="2025-10-02 16:35:14.399736848 +0000 UTC m=+1073.148966445" watchObservedRunningTime="2025-10-02 16:35:14.403522692 +0000 UTC m=+1073.152752219" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.306970 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-mx2hz" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.358973 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-nwf78" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.398626 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-2sngh" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.455978 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-krzwg" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.592559 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-bhztc" Oct 02 16:35:19 crc kubenswrapper[4882]: I1002 16:35:19.993126 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-25zdk" Oct 02 16:35:20 crc kubenswrapper[4882]: I1002 16:35:20.027330 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-qmnwx" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.191744 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.193731 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.195697 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.196370 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.196906 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.196959 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kf4fm" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.203193 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.208413 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.208518 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5cbl\" (UniqueName: \"kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.261612 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.262748 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.267268 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.276416 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.309754 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5cbl\" (UniqueName: \"kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.309818 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.309858 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.309889 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.309970 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wgp\" (UniqueName: \"kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.311267 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.330745 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5cbl\" (UniqueName: \"kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl\") pod \"dnsmasq-dns-6d84845cb9-lc5k4\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.411523 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wgp\" (UniqueName: \"kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 
crc kubenswrapper[4882]: I1002 16:35:34.411602 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.411660 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.412515 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.412641 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.427952 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wgp\" (UniqueName: \"kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp\") pod \"dnsmasq-dns-8687b65d7f-q8g6z\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.511770 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:34 crc kubenswrapper[4882]: I1002 16:35:34.579082 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:35 crc kubenswrapper[4882]: I1002 16:35:35.011991 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:35 crc kubenswrapper[4882]: W1002 16:35:35.016184 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d73c67b_56b6_42f3_a7a2_721a38855a42.slice/crio-ac8eb892cf878dc01bfebafd45d4f0496b012066f8bf2661a96a6651af3d0db5 WatchSource:0}: Error finding container ac8eb892cf878dc01bfebafd45d4f0496b012066f8bf2661a96a6651af3d0db5: Status 404 returned error can't find the container with id ac8eb892cf878dc01bfebafd45d4f0496b012066f8bf2661a96a6651af3d0db5 Oct 02 16:35:35 crc kubenswrapper[4882]: I1002 16:35:35.089439 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:35 crc kubenswrapper[4882]: W1002 16:35:35.096170 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod298aa8d3_d8e9_4315_98c3_4f081b36028c.slice/crio-4cf81ca96756f34409741f9e4b8d621e2bebcce4bf527ed33d082dd8c71b79f5 WatchSource:0}: Error finding container 4cf81ca96756f34409741f9e4b8d621e2bebcce4bf527ed33d082dd8c71b79f5: Status 404 returned error can't find the container with id 4cf81ca96756f34409741f9e4b8d621e2bebcce4bf527ed33d082dd8c71b79f5 Oct 02 16:35:35 crc kubenswrapper[4882]: I1002 16:35:35.502406 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" event={"ID":"0d73c67b-56b6-42f3-a7a2-721a38855a42","Type":"ContainerStarted","Data":"ac8eb892cf878dc01bfebafd45d4f0496b012066f8bf2661a96a6651af3d0db5"} Oct 02 16:35:35 crc kubenswrapper[4882]: I1002 16:35:35.504652 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" event={"ID":"298aa8d3-d8e9-4315-98c3-4f081b36028c","Type":"ContainerStarted","Data":"4cf81ca96756f34409741f9e4b8d621e2bebcce4bf527ed33d082dd8c71b79f5"} Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.001149 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.015959 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.025265 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.035596 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.049751 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjc8\" (UniqueName: \"kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.050130 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.050268 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.152907 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.153244 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.153297 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjc8\" (UniqueName: \"kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.154411 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.157179 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.206493 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjc8\" (UniqueName: 
\"kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8\") pod \"dnsmasq-dns-b599c6fc9-ctxjs\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.371359 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.416323 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.488772 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.490394 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.509916 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.676351 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj42z\" (UniqueName: \"kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.676733 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.676757 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.777962 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.778015 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.778058 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj42z\" (UniqueName: \"kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.779058 4882 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.779775 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.801983 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj42z\" (UniqueName: \"kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z\") pod \"dnsmasq-dns-5cb7995759-2kf5r\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.843427 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:35:37 crc kubenswrapper[4882]: I1002 16:35:37.985918 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:35:38 crc kubenswrapper[4882]: W1002 16:35:38.011940 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d8585fc_95e8_43cd_8e92_a659be92cee6.slice/crio-64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85 WatchSource:0}: Error finding container 64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85: Status 404 returned error can't find the container with id 64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85 Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.188407 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.190438 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.192892 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qhjfm" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.192947 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.192964 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.192955 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.193043 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.193074 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.193155 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.207004 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.294787 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:35:38 crc kubenswrapper[4882]: W1002 16:35:38.309057 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9211fb00_17ac_4591_9058_6de64e3ef7ba.slice/crio-c3b0c6868d7d795ecd941a5c19e8d6c7de14510ddfea102a30f113a684b1fe16 WatchSource:0}: Error finding container c3b0c6868d7d795ecd941a5c19e8d6c7de14510ddfea102a30f113a684b1fe16: Status 404 returned error can't find the container with id c3b0c6868d7d795ecd941a5c19e8d6c7de14510ddfea102a30f113a684b1fe16 Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386109 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386203 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386279 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386319 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386377 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386582 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386725 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556fs\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386800 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.386855 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.387045 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.387110 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488122 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488184 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488248 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556fs\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488271 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488291 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488315 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488341 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488383 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488419 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488461 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.488488 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.489399 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.489507 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.489750 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.489949 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.489752 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.534398 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.548576 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" event={"ID":"9211fb00-17ac-4591-9058-6de64e3ef7ba","Type":"ContainerStarted","Data":"c3b0c6868d7d795ecd941a5c19e8d6c7de14510ddfea102a30f113a684b1fe16"} Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.549850 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerStarted","Data":"64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85"} Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.612995 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.614038 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.615197 4882 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.616701 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.617695 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.617887 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2f2ps" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.619152 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.619335 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556fs\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.619439 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.619611 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.618167 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.626411 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.629538 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.629586 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.629684 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.636457 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.792968 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793366 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793393 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793454 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793521 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793547 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793577 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793627 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793657 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kstzg\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793706 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.793727 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.874695 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895553 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895622 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895657 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895693 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kstzg\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895722 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895756 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895815 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895841 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " 
pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895864 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895915 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.895986 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.896899 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.897842 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.897863 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.899729 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.901613 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.901908 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.903499 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.905607 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.908612 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.909139 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.918440 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kstzg\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.928397 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " pod="openstack/rabbitmq-server-0" Oct 02 16:35:38 crc kubenswrapper[4882]: I1002 16:35:38.943008 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.345110 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.542732 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:35:39 crc kubenswrapper[4882]: W1002 16:35:39.552830 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a0ea19_b66f_4dc9_95a5_b6dd8fc3eb42.slice/crio-27e111b66dcfa7e5cd3f137f243ac139b62161da38921f13664b42ce7030ba60 WatchSource:0}: Error finding container 27e111b66dcfa7e5cd3f137f243ac139b62161da38921f13664b42ce7030ba60: Status 404 returned error can't find the container with id 27e111b66dcfa7e5cd3f137f243ac139b62161da38921f13664b42ce7030ba60 Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.560116 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerStarted","Data":"9511454e831a71a2a3e763a37979a48a0a41b701f447b13d0e58e793d20ded8c"} Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.753847 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.755387 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.759951 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.762333 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zwqvh" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.764796 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.764883 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.764804 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.767418 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.772856 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.924816 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.924913 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925017 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925044 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925070 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925159 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925271 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925391 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:39 crc kubenswrapper[4882]: I1002 16:35:39.925539 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.026896 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.026997 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027025 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027054 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027080 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027109 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027152 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027204 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027256 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027651 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.027798 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.029436 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 
16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.030593 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.030901 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.077532 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.077586 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.077669 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.090427 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.091657 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn\") pod \"openstack-galera-0\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.387609 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.574508 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerStarted","Data":"27e111b66dcfa7e5cd3f137f243ac139b62161da38921f13664b42ce7030ba60"} Oct 02 16:35:40 crc kubenswrapper[4882]: I1002 16:35:40.896674 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.069541 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.071499 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.076019 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-49zhs" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.076269 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.077956 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.078258 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.087955 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149574 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149692 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149728 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149754 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149773 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.149813 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.150065 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sb95\" (UniqueName: 
\"kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.150202 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.150286 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.252296 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sb95\" (UniqueName: \"kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.252351 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.252379 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.252416 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.252485 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.254106 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.254138 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.254158 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.254200 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.258606 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.259150 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.260842 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.261485 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.261951 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.262162 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.265242 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc 
kubenswrapper[4882]: I1002 16:35:41.265354 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.275253 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.280943 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.285665 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.285934 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.285976 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.286144 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-sk55k" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.288835 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sb95\" (UniqueName: \"kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.334600 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") " pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.355964 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.356019 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg57d\" (UniqueName: \"kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.356064 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.356084 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " 
pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.356267 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.409657 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.461750 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.461877 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg57d\" (UniqueName: \"kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.461965 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.461994 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.462060 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.463098 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.463203 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.481082 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.481202 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.486885 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg57d\" (UniqueName: \"kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d\") pod \"memcached-0\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " pod="openstack/memcached-0" Oct 02 16:35:41 crc kubenswrapper[4882]: I1002 16:35:41.688776 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 16:35:42 crc kubenswrapper[4882]: I1002 16:35:42.941967 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:35:42 crc kubenswrapper[4882]: I1002 16:35:42.943521 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:35:42 crc kubenswrapper[4882]: I1002 16:35:42.948629 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-d5fds" Oct 02 16:35:42 crc kubenswrapper[4882]: I1002 16:35:42.950237 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:35:43 crc kubenswrapper[4882]: I1002 16:35:43.030105 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlv5n\" (UniqueName: \"kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n\") pod \"kube-state-metrics-0\" (UID: \"5d019714-550c-4da5-9d79-8bd03c1cb2f6\") " pod="openstack/kube-state-metrics-0" Oct 02 16:35:43 crc kubenswrapper[4882]: I1002 16:35:43.132544 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlv5n\" (UniqueName: \"kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n\") pod \"kube-state-metrics-0\" (UID: \"5d019714-550c-4da5-9d79-8bd03c1cb2f6\") " pod="openstack/kube-state-metrics-0" Oct 02 16:35:43 crc kubenswrapper[4882]: I1002 16:35:43.155994 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlv5n\" (UniqueName: \"kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n\") pod \"kube-state-metrics-0\" (UID: \"5d019714-550c-4da5-9d79-8bd03c1cb2f6\") " pod="openstack/kube-state-metrics-0" Oct 02 16:35:43 crc kubenswrapper[4882]: I1002 16:35:43.289141 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:35:45 crc kubenswrapper[4882]: W1002 16:35:45.857817 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1004bf_948f_4aae_b19b_a1321eab3b03.slice/crio-baf46152ef0a6b3a70cbdb5edf7d9d04c22b36892b776a0c61643b1e1e801dc9 WatchSource:0}: Error finding container baf46152ef0a6b3a70cbdb5edf7d9d04c22b36892b776a0c61643b1e1e801dc9: Status 404 returned error can't find the container with id baf46152ef0a6b3a70cbdb5edf7d9d04c22b36892b776a0c61643b1e1e801dc9 Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.187654 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.189170 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.191989 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s4gm5" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.192257 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.196146 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.200591 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.250333 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.252533 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.256467 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298012 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298076 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298130 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298170 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nclpr\" (UniqueName: \"kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298197 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298358 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.298403 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400274 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400362 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400406 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400456 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400506 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400553 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400595 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400619 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400645 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400668 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m94dk\" (UniqueName: \"kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400695 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nclpr\" (UniqueName: \"kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr\") pod \"ovn-controller-v7dcx\" (UID: 
\"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400719 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.400743 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.401155 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.401324 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.402202 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.404590 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.408730 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.408831 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.420440 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nclpr\" (UniqueName: \"kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr\") pod \"ovn-controller-v7dcx\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502390 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502479 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502520 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m94dk\" (UniqueName: \"kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502551 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502638 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502685 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.502883 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.503103 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.503195 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.505630 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: 
I1002 16:35:46.505795 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.519742 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.520505 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m94dk\" (UniqueName: \"kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk\") pod \"ovn-controller-ovs-wsqlw\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.570556 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:35:46 crc kubenswrapper[4882]: I1002 16:35:46.693760 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerStarted","Data":"baf46152ef0a6b3a70cbdb5edf7d9d04c22b36892b776a0c61643b1e1e801dc9"} Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.118099 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.119751 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.123657 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.123767 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.123913 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.124293 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vkcx2" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.126239 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.149878 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251053 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251365 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251395 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251425 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251466 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251533 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87l2b\" (UniqueName: \"kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251579 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.251608 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353368 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353429 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353451 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353482 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353515 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353552 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353575 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87l2b\" (UniqueName: \"kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353610 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.353755 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.354958 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.355010 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.355261 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.359395 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.360079 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.360471 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.375326 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87l2b\" (UniqueName: \"kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.380444 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:49 crc kubenswrapper[4882]: I1002 16:35:49.456891 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.496190 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.498443 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.501849 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.502069 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.502380 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qpsd9" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.505437 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.511833 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.572547 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.572929 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.572995 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.573017 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.573128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.573179 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.573226 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzg6d\" (UniqueName: \"kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.573346 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675383 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675435 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675474 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675507 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzg6d\" (UniqueName: \"kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675581 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675633 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.675669 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.676124 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.676264 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.676342 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.676642 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.682266 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.688088 4882 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.688675 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.693298 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzg6d\" (UniqueName: \"kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.703832 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:50 crc kubenswrapper[4882]: I1002 16:35:50.826580 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.240863 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.241253 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t5wgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8687b65d7f-q8g6z_openstack(298aa8d3-d8e9-4315-98c3-4f081b36028c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.242430 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" podUID="298aa8d3-d8e9-4315-98c3-4f081b36028c" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.399711 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.399993 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b5cbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d84845cb9-lc5k4_openstack(0d73c67b-56b6-42f3-a7a2-721a38855a42): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:35:54 crc kubenswrapper[4882]: E1002 16:35:54.401261 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" podUID="0d73c67b-56b6-42f3-a7a2-721a38855a42" Oct 02 16:35:54 crc kubenswrapper[4882]: I1002 16:35:54.708390 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 16:35:55 crc kubenswrapper[4882]: W1002 16:35:55.621800 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb156c0c6_4395_4609_8260_5ee8943d6813.slice/crio-9b0c2a27240f8fc32be35d9bfb98821d1b35755d06218c96ee1b3acf010a2692 WatchSource:0}: Error finding container 9b0c2a27240f8fc32be35d9bfb98821d1b35755d06218c96ee1b3acf010a2692: Status 404 returned error can't find the container with id 9b0c2a27240f8fc32be35d9bfb98821d1b35755d06218c96ee1b3acf010a2692 Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.701678 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.709603 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.758660 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5cbl\" (UniqueName: \"kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl\") pod \"0d73c67b-56b6-42f3-a7a2-721a38855a42\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.758976 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc\") pod \"298aa8d3-d8e9-4315-98c3-4f081b36028c\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.759030 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config\") pod \"298aa8d3-d8e9-4315-98c3-4f081b36028c\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.759126 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config\") pod \"0d73c67b-56b6-42f3-a7a2-721a38855a42\" (UID: \"0d73c67b-56b6-42f3-a7a2-721a38855a42\") " Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.759152 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wgp\" (UniqueName: \"kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp\") pod \"298aa8d3-d8e9-4315-98c3-4f081b36028c\" (UID: \"298aa8d3-d8e9-4315-98c3-4f081b36028c\") " Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.760762 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "298aa8d3-d8e9-4315-98c3-4f081b36028c" (UID: "298aa8d3-d8e9-4315-98c3-4f081b36028c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.776663 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config" (OuterVolumeSpecName: "config") pod "298aa8d3-d8e9-4315-98c3-4f081b36028c" (UID: "298aa8d3-d8e9-4315-98c3-4f081b36028c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.777177 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config" (OuterVolumeSpecName: "config") pod "0d73c67b-56b6-42f3-a7a2-721a38855a42" (UID: "0d73c67b-56b6-42f3-a7a2-721a38855a42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.780969 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp" (OuterVolumeSpecName: "kube-api-access-t5wgp") pod "298aa8d3-d8e9-4315-98c3-4f081b36028c" (UID: "298aa8d3-d8e9-4315-98c3-4f081b36028c"). InnerVolumeSpecName "kube-api-access-t5wgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.785403 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" event={"ID":"0d73c67b-56b6-42f3-a7a2-721a38855a42","Type":"ContainerDied","Data":"ac8eb892cf878dc01bfebafd45d4f0496b012066f8bf2661a96a6651af3d0db5"} Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.785473 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-lc5k4" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.786624 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl" (OuterVolumeSpecName: "kube-api-access-b5cbl") pod "0d73c67b-56b6-42f3-a7a2-721a38855a42" (UID: "0d73c67b-56b6-42f3-a7a2-721a38855a42"). InnerVolumeSpecName "kube-api-access-b5cbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.787012 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerStarted","Data":"9b0c2a27240f8fc32be35d9bfb98821d1b35755d06218c96ee1b3acf010a2692"} Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.788734 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" event={"ID":"298aa8d3-d8e9-4315-98c3-4f081b36028c","Type":"ContainerDied","Data":"4cf81ca96756f34409741f9e4b8d621e2bebcce4bf527ed33d082dd8c71b79f5"} Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.788798 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-q8g6z" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.857477 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.861246 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5cbl\" (UniqueName: \"kubernetes.io/projected/0d73c67b-56b6-42f3-a7a2-721a38855a42-kube-api-access-b5cbl\") on node \"crc\" DevicePath \"\"" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.861277 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.861287 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298aa8d3-d8e9-4315-98c3-4f081b36028c-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.861295 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d73c67b-56b6-42f3-a7a2-721a38855a42-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.861304 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wgp\" (UniqueName: \"kubernetes.io/projected/298aa8d3-d8e9-4315-98c3-4f081b36028c-kube-api-access-t5wgp\") on node \"crc\" DevicePath \"\"" Oct 02 16:35:55 crc kubenswrapper[4882]: I1002 16:35:55.865559 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-q8g6z"] Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.036128 4882 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.142439 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.156979 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-lc5k4"] Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.239674 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.771024 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d73c67b-56b6-42f3-a7a2-721a38855a42" path="/var/lib/kubelet/pods/0d73c67b-56b6-42f3-a7a2-721a38855a42/volumes" Oct 02 16:35:56 crc kubenswrapper[4882]: I1002 16:35:56.771597 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298aa8d3-d8e9-4315-98c3-4f081b36028c" path="/var/lib/kubelet/pods/298aa8d3-d8e9-4315-98c3-4f081b36028c/volumes" Oct 02 16:35:57 crc kubenswrapper[4882]: W1002 16:35:57.745180 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod142d02f0_5616_42b6_b6fc_b37df2639f8a.slice/crio-3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4 WatchSource:0}: Error finding container 3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4: Status 404 returned error can't find the container with id 3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4 Oct 02 16:35:57 crc kubenswrapper[4882]: I1002 16:35:57.827658 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d019714-550c-4da5-9d79-8bd03c1cb2f6","Type":"ContainerStarted","Data":"255016af16e080e283a42effac070fb40b9888758df49e3ee094f4efe07d1222"} Oct 02 16:35:57 crc kubenswrapper[4882]: I1002 16:35:57.839841 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerStarted","Data":"3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.291555 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.327627 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 16:35:58 crc kubenswrapper[4882]: W1002 16:35:58.337178 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ada43e_a36e_49c7_bc9e_6c3151d2eb6b.slice/crio-ffd18c2b01e6b06ae1a6a1265915fa190763d51c676bef284a24d42044e3b515 WatchSource:0}: Error finding container ffd18c2b01e6b06ae1a6a1265915fa190763d51c676bef284a24d42044e3b515: Status 404 returned error can't find the container with id ffd18c2b01e6b06ae1a6a1265915fa190763d51c676bef284a24d42044e3b515 Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.341309 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.415074 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:35:58 crc kubenswrapper[4882]: W1002 16:35:58.492242 4882 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21667760_8ee1_456b_af11_a501cdf77822.slice/crio-d79a25598f3a568bf528327c7efdea8d9bb84f3121c50ff5aa84475e9f96c441 WatchSource:0}: Error finding container d79a25598f3a568bf528327c7efdea8d9bb84f3121c50ff5aa84475e9f96c441: Status 404 returned error can't find the container with id d79a25598f3a568bf528327c7efdea8d9bb84f3121c50ff5aa84475e9f96c441 Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.849883 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerStarted","Data":"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.851560 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx" event={"ID":"e205184e-bcf8-498d-8a1a-bc1c8539c2ae","Type":"ContainerStarted","Data":"7e1a0b6bad817661a14e4dbd53c4e6d723d2a62695a9e41d24346f02e1a6c19d"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.852912 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b","Type":"ContainerStarted","Data":"ffd18c2b01e6b06ae1a6a1265915fa190763d51c676bef284a24d42044e3b515"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.853963 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerStarted","Data":"e2acd5cc3a0b8d34bc7e2567945a783842fddd34f7684be0d416c168d0faa381"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.854998 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerStarted","Data":"d79a25598f3a568bf528327c7efdea8d9bb84f3121c50ff5aa84475e9f96c441"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.856404 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerStarted","Data":"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"} Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.858592 4882 generic.go:334] "Generic (PLEG): container finished" podID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerID="be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc" exitCode=0 Oct 02 16:35:58 crc kubenswrapper[4882]: I1002 16:35:58.858641 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" event={"ID":"9211fb00-17ac-4591-9058-6de64e3ef7ba","Type":"ContainerDied","Data":"be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc"} Oct 02 16:35:59 crc kubenswrapper[4882]: I1002 16:35:59.872175 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerStarted","Data":"29253f1351cb8c376d642b9c868a7dc63855121b01b0705c9179056a230e6ebf"} Oct 02 16:35:59 crc kubenswrapper[4882]: I1002 16:35:59.877050 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerStarted","Data":"0d67f1f0413558904549638569857532409de73c996c52a84ea4614ae2ae72a3"} Oct 02 16:35:59 crc kubenswrapper[4882]: I1002 16:35:59.880031 4882 generic.go:334] "Generic (PLEG): container 
finished" podID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerID="111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7" exitCode=0 Oct 02 16:35:59 crc kubenswrapper[4882]: I1002 16:35:59.880086 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerDied","Data":"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7"} Oct 02 16:36:00 crc kubenswrapper[4882]: E1002 16:36:00.470378 4882 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 02 16:36:00 crc kubenswrapper[4882]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4d8585fc-95e8-43cd-8e92-a659be92cee6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 16:36:00 crc kubenswrapper[4882]: > podSandboxID="64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85" Oct 02 16:36:00 crc kubenswrapper[4882]: E1002 16:36:00.470921 4882 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 02 16:36:00 crc kubenswrapper[4882]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fhjc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b599c6fc9-ctxjs_openstack(4d8585fc-95e8-43cd-8e92-a659be92cee6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4d8585fc-95e8-43cd-8e92-a659be92cee6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 02 16:36:00 crc kubenswrapper[4882]: > logger="UnhandledError" Oct 02 16:36:00 crc kubenswrapper[4882]: E1002 16:36:00.472084 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4d8585fc-95e8-43cd-8e92-a659be92cee6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" Oct 02 16:36:00 crc kubenswrapper[4882]: I1002 16:36:00.895306 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerStarted","Data":"56f5e136c1bbeef542d74f85786e2eb58e6cd2e6742745adf39c24f6ed827f4d"} Oct 02 16:36:00 crc kubenswrapper[4882]: I1002 16:36:00.898042 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" event={"ID":"9211fb00-17ac-4591-9058-6de64e3ef7ba","Type":"ContainerStarted","Data":"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8"} Oct 02 16:36:00 crc kubenswrapper[4882]: I1002 16:36:00.944427 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" podStartSLOduration=4.319596603 podStartE2EDuration="23.944400495s" podCreationTimestamp="2025-10-02 16:35:37 +0000 UTC" firstStartedPulling="2025-10-02 16:35:38.312815707 +0000 UTC m=+1097.062045234" lastFinishedPulling="2025-10-02 16:35:57.937619599 +0000 UTC m=+1116.686849126" observedRunningTime="2025-10-02 16:36:00.94184439 +0000 UTC m=+1119.691073917" watchObservedRunningTime="2025-10-02 16:36:00.944400495 +0000 UTC m=+1119.693630022" Oct 02 16:36:01 crc kubenswrapper[4882]: I1002 16:36:01.909668 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerStarted","Data":"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e"} Oct 02 16:36:01 crc kubenswrapper[4882]: I1002 16:36:01.910079 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:36:01 crc kubenswrapper[4882]: I1002 16:36:01.910271 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:36:02 crc kubenswrapper[4882]: I1002 16:36:02.798651 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" podStartSLOduration=6.798959158 podStartE2EDuration="26.798629034s" podCreationTimestamp="2025-10-02 16:35:36 +0000 UTC" firstStartedPulling="2025-10-02 16:35:38.018278694 +0000 UTC m=+1096.767508221" lastFinishedPulling="2025-10-02 16:35:58.01794857 +0000 UTC m=+1116.767178097" observedRunningTime="2025-10-02 16:36:01.932289237 +0000 UTC m=+1120.681518764" watchObservedRunningTime="2025-10-02 16:36:02.798629034 +0000 UTC m=+1121.547858561" Oct 02 16:36:05 crc kubenswrapper[4882]: I1002 16:36:05.952424 4882 generic.go:334] "Generic (PLEG): container finished" podID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerID="97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854" exitCode=0 Oct 02 16:36:05 crc kubenswrapper[4882]: I1002 16:36:05.952522 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerDied","Data":"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"} Oct 02 16:36:05 crc kubenswrapper[4882]: I1002 16:36:05.957234 4882 generic.go:334] "Generic (PLEG): container finished" podID="b156c0c6-4395-4609-8260-5ee8943d6813" containerID="29253f1351cb8c376d642b9c868a7dc63855121b01b0705c9179056a230e6ebf" exitCode=0 Oct 02 16:36:05 crc kubenswrapper[4882]: I1002 16:36:05.957278 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerDied","Data":"29253f1351cb8c376d642b9c868a7dc63855121b01b0705c9179056a230e6ebf"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.965955 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerStarted","Data":"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.967506 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx" event={"ID":"e205184e-bcf8-498d-8a1a-bc1c8539c2ae","Type":"ContainerStarted","Data":"1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.967643 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-v7dcx" Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.968716 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b","Type":"ContainerStarted","Data":"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.968913 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.969913 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerStarted","Data":"3f058549c5a2bb644c18b7d33ab3953b22f72d0657caaa9bed2e0b3555256692"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.971150 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" 
event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerStarted","Data":"ddb5fdad9fc0ae4cf9f153e2b427d0512f3422d8375ee4e7f8e97b3474db9d80"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.973297 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerStarted","Data":"ce86d9d2458ffecb5f7269b0144634e7551d8db7396b16d935a1e65eea3b1726"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.975442 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerStarted","Data":"cabd7eae8001349116cafdd29dd1df7fb3b6e49cd24abaa8c9cf5171da8dae76"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.976683 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d019714-550c-4da5-9d79-8bd03c1cb2f6","Type":"ContainerStarted","Data":"fd3d1168a9052070845923b9e8dea6c765e24e4825cb3252fba286141fb23f1a"} Oct 02 16:36:06 crc kubenswrapper[4882]: I1002 16:36:06.976844 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.010313 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=16.856988234 podStartE2EDuration="29.010268249s" podCreationTimestamp="2025-10-02 16:35:38 +0000 UTC" firstStartedPulling="2025-10-02 16:35:45.862050179 +0000 UTC m=+1104.611279706" lastFinishedPulling="2025-10-02 16:35:58.015330194 +0000 UTC m=+1116.764559721" observedRunningTime="2025-10-02 16:36:06.990325925 +0000 UTC m=+1125.739555452" watchObservedRunningTime="2025-10-02 16:36:07.010268249 +0000 UTC m=+1125.759497776" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.057831 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.342382534 podStartE2EDuration="25.057812791s" podCreationTimestamp="2025-10-02 16:35:42 +0000 UTC" firstStartedPulling="2025-10-02 16:35:57.947980951 +0000 UTC m=+1116.697210478" lastFinishedPulling="2025-10-02 16:36:06.663411208 +0000 UTC m=+1125.412640735" observedRunningTime="2025-10-02 16:36:07.054646122 +0000 UTC m=+1125.803875649" watchObservedRunningTime="2025-10-02 16:36:07.057812791 +0000 UTC m=+1125.807042318" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.077720 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.727599384 podStartE2EDuration="26.077697464s" podCreationTimestamp="2025-10-02 16:35:41 +0000 UTC" firstStartedPulling="2025-10-02 16:35:58.345494253 +0000 UTC m=+1117.094723780" lastFinishedPulling="2025-10-02 16:36:05.695592323 +0000 UTC m=+1124.444821860" observedRunningTime="2025-10-02 16:36:07.071129589 +0000 UTC m=+1125.820359116" watchObservedRunningTime="2025-10-02 16:36:07.077697464 +0000 UTC m=+1125.826926991" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.097366 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.506953705 podStartE2EDuration="27.097342332s" podCreationTimestamp="2025-10-02 16:35:40 +0000 UTC" firstStartedPulling="2025-10-02 16:35:55.644095559 +0000 UTC m=+1114.393325086" lastFinishedPulling="2025-10-02 16:35:58.234484176 +0000 UTC m=+1116.983713713" 
observedRunningTime="2025-10-02 16:36:07.090445707 +0000 UTC m=+1125.839675234" watchObservedRunningTime="2025-10-02 16:36:07.097342332 +0000 UTC m=+1125.846571849" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.373972 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.393393 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v7dcx" podStartSLOduration=14.532194669999999 podStartE2EDuration="21.393370627s" podCreationTimestamp="2025-10-02 16:35:46 +0000 UTC" firstStartedPulling="2025-10-02 16:35:58.353836804 +0000 UTC m=+1117.103066331" lastFinishedPulling="2025-10-02 16:36:05.215012761 +0000 UTC m=+1123.964242288" observedRunningTime="2025-10-02 16:36:07.113180712 +0000 UTC m=+1125.862410249" watchObservedRunningTime="2025-10-02 16:36:07.393370627 +0000 UTC m=+1126.142600154" Oct 02 16:36:07 crc kubenswrapper[4882]: E1002 16:36:07.722667 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21667760_8ee1_456b_af11_a501cdf77822.slice/crio-conmon-ddb5fdad9fc0ae4cf9f153e2b427d0512f3422d8375ee4e7f8e97b3474db9d80.scope\": RecentStats: unable to find data in memory cache]" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.845371 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.908861 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.989815 4882 generic.go:334] "Generic (PLEG): container finished" podID="21667760-8ee1-456b-af11-a501cdf77822" containerID="ddb5fdad9fc0ae4cf9f153e2b427d0512f3422d8375ee4e7f8e97b3474db9d80" exitCode=0 Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.990020 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerDied","Data":"ddb5fdad9fc0ae4cf9f153e2b427d0512f3422d8375ee4e7f8e97b3474db9d80"} Oct 02 16:36:07 crc kubenswrapper[4882]: I1002 16:36:07.990752 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="dnsmasq-dns" containerID="cri-o://13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e" gracePeriod=10 Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.487817 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.536446 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc\") pod \"4d8585fc-95e8-43cd-8e92-a659be92cee6\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.536567 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config\") pod \"4d8585fc-95e8-43cd-8e92-a659be92cee6\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.536725 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhjc8\" (UniqueName: \"kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8\") pod \"4d8585fc-95e8-43cd-8e92-a659be92cee6\" (UID: \"4d8585fc-95e8-43cd-8e92-a659be92cee6\") " Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.548634 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8" (OuterVolumeSpecName: "kube-api-access-fhjc8") pod "4d8585fc-95e8-43cd-8e92-a659be92cee6" (UID: "4d8585fc-95e8-43cd-8e92-a659be92cee6"). InnerVolumeSpecName "kube-api-access-fhjc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.591354 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config" (OuterVolumeSpecName: "config") pod "4d8585fc-95e8-43cd-8e92-a659be92cee6" (UID: "4d8585fc-95e8-43cd-8e92-a659be92cee6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.611277 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d8585fc-95e8-43cd-8e92-a659be92cee6" (UID: "4d8585fc-95e8-43cd-8e92-a659be92cee6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.639270 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.639712 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhjc8\" (UniqueName: \"kubernetes.io/projected/4d8585fc-95e8-43cd-8e92-a659be92cee6-kube-api-access-fhjc8\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:08 crc kubenswrapper[4882]: I1002 16:36:08.639725 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8585fc-95e8-43cd-8e92-a659be92cee6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.028473 4882 generic.go:334] "Generic (PLEG): container finished" podID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerID="13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e" exitCode=0 Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.029827 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.030262 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerDied","Data":"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e"} Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.030351 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ctxjs" event={"ID":"4d8585fc-95e8-43cd-8e92-a659be92cee6","Type":"ContainerDied","Data":"64cc60dd72f5cbd7a27b0603c3cefbf28093061951e63685ea6e5319b9f90f85"} Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.030374 4882 scope.go:117] "RemoveContainer" containerID="13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.039205 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerStarted","Data":"8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490"} Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.039389 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerStarted","Data":"6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547"} Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.039880 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.039925 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.056933 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.060127 4882 scope.go:117] "RemoveContainer" containerID="111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.061163 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-b599c6fc9-ctxjs"] Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.083173 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wsqlw" podStartSLOduration=16.438659192 podStartE2EDuration="23.083158839s" podCreationTimestamp="2025-10-02 16:35:46 +0000 UTC" firstStartedPulling="2025-10-02 16:35:58.495200399 +0000 UTC m=+1117.244429926" lastFinishedPulling="2025-10-02 16:36:05.139700006 +0000 UTC m=+1123.888929573" observedRunningTime="2025-10-02 16:36:09.081680152 +0000 UTC m=+1127.830909679" watchObservedRunningTime="2025-10-02 16:36:09.083158839 +0000 UTC m=+1127.832388356" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.116536 4882 scope.go:117] "RemoveContainer" containerID="13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e" Oct 02 16:36:09 crc kubenswrapper[4882]: E1002 16:36:09.116903 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e\": container with ID starting with 13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e not found: ID does not exist" containerID="13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.116931 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e"} err="failed to get container status \"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e\": rpc error: code = NotFound desc = could not find container \"13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e\": container with ID starting with 13738e9d9cd1f8eb0e8033e52c6685263680d1ab3b6f4a3533d2a42623602c1e not found: ID does not exist" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.116951 4882 scope.go:117] "RemoveContainer" containerID="111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7" Oct 02 16:36:09 crc kubenswrapper[4882]: E1002 16:36:09.117723 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7\": container with ID starting with 111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7 not found: ID does not exist" containerID="111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.117796 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7"} err="failed to get container status \"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7\": rpc error: code = NotFound desc = could not find container \"111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7\": container with ID starting with 111caf1d37cf174d4327ebbed2f6add081b178f8db64b9c12a8278cf594193e7 not found: ID does not exist" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.390855 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 
16:36:09.391417 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.490135 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"] Oct 02 16:36:09 crc kubenswrapper[4882]: E1002 16:36:09.490485 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="init" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.490503 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="init" Oct 02 16:36:09 crc kubenswrapper[4882]: E1002 16:36:09.490530 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="dnsmasq-dns" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.490537 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="dnsmasq-dns" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.490718 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" containerName="dnsmasq-dns" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.491427 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.493314 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.497391 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"] Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552036 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552090 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552113 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552137 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kq6\" (UniqueName: \"kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6\") pod \"ovn-controller-metrics-rhsqp\" (UID: 
\"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552259 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.552298 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.653782 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.653856 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.653883 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.653913 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kq6\" (UniqueName: \"kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.654009 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.654060 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.654931 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " 
pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.655223 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.655277 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.677601 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.679566 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.683187 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kq6\" (UniqueName: \"kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6\") pod \"ovn-controller-metrics-rhsqp\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.686378 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.688012 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.691810 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.695794 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.755437 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfp4n\" (UniqueName: \"kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.755509 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.755566 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.755607 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.816272 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.857407 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfp4n\" (UniqueName: \"kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.857492 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.857548 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.857596 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.858362 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.858971 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.859000 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:09 crc kubenswrapper[4882]: I1002 16:36:09.877183 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfp4n\" (UniqueName: \"kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n\") pod \"dnsmasq-dns-6d8b9bcdd7-lprvx\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") " pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.062454 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.085940 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.111791 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.114228 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.116430 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.146948 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.163047 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qh5d\" (UniqueName: \"kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.163130 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.163385 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.163460 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.163484 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.265122 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.265196 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc\") pod 
\"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.265269 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qh5d\" (UniqueName: \"kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.265335 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.265420 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.266382 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.267023 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.268080 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.268090 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.284707 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qh5d\" (UniqueName: \"kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d\") pod \"dnsmasq-dns-84d58dc6cf-bj9kt\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.387998 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.388373 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 16:36:10 
crc kubenswrapper[4882]: I1002 16:36:10.433170 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:10 crc kubenswrapper[4882]: I1002 16:36:10.773111 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8585fc-95e8-43cd-8e92-a659be92cee6" path="/var/lib/kubelet/pods/4d8585fc-95e8-43cd-8e92-a659be92cee6/volumes" Oct 02 16:36:11 crc kubenswrapper[4882]: I1002 16:36:11.410144 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 16:36:11 crc kubenswrapper[4882]: I1002 16:36:11.410204 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 16:36:11 crc kubenswrapper[4882]: I1002 16:36:11.513154 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 16:36:11 crc kubenswrapper[4882]: I1002 16:36:11.692711 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 16:36:12 crc kubenswrapper[4882]: I1002 16:36:12.109431 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 16:36:12 crc kubenswrapper[4882]: I1002 16:36:12.474397 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 16:36:12 crc kubenswrapper[4882]: I1002 16:36:12.583876 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 16:36:12 crc kubenswrapper[4882]: I1002 16:36:12.916037 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.000444 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.016228 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.305027 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.309747 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.352131 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.356538 4882 util.go:30] "No sandbox for pod can be found. 
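
The galera entries just above trace startup-probe gating: each pod first reports probe="startup" status="unhealthy", then status="started", and only after that does a readiness result of "ready" appear, because a pending startup probe holds off the other checks. A rough sketch of such a probe pair with the Kubernetes Go API types; every port, command, and threshold below is an illustrative assumption, not read from the actual galera pod spec:

    package main

    import (
    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    // Illustrative startup/readiness probe pair. While the startup probe is
    // failing ("status=unhealthy" above), the kubelet suppresses the other
    // probes; once it reports "started", readiness probing takes over.
    func galeraProbes() (startup, readiness *corev1.Probe) {
    	startup = &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			TCPSocket: &corev1.TCPSocketAction{Port: intstr.FromInt(3306)}, // assumed port
    		},
    		PeriodSeconds:    2,  // assumed
    		FailureThreshold: 30, // assumed
    	}
    	readiness = &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			Exec: &corev1.ExecAction{Command: []string{"/bin/sh", "-c", "mysqladmin ping"}}, // assumed
    		},
    		PeriodSeconds: 5, // assumed
    	}
    	return startup, readiness
    }

    func main() { _, _ = galeraProbes() }
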
Need to start a new one" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.367613 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.542404 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.542477 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.542508 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc69v\" (UniqueName: \"kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.542526 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.542566 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.644909 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.645005 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.645054 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc69v\" (UniqueName: \"kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.645095 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.645142 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.646256 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.646465 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.646593 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.647242 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.672769 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc69v\" (UniqueName: \"kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v\") pod \"dnsmasq-dns-59775c759f-z44vx\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:13 crc kubenswrapper[4882]: I1002 16:36:13.689904 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.084113 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" event={"ID":"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f","Type":"ContainerStarted","Data":"3ee5ed19f32c0a694d51378f047a238dedca4916de0843d28679d60758dc0684"} Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.084938 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" event={"ID":"1a102fad-3c46-4428-9c16-b2dcf62c9cf1","Type":"ContainerStarted","Data":"af7b55ec6b22d008c9b768a9164067af0febd7280679d022a54d5a707aa61c09"} Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.086199 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rhsqp" event={"ID":"92887968-fdd5-4653-a151-70e4a8f963fc","Type":"ContainerStarted","Data":"fbac9db1e92df8c7a67eb7a16d69de9b0b4b044771209e8e76103bb854fbf827"} Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.498308 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.503287 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.505422 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.505656 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.506469 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.506975 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8pd6n" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.556535 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.660319 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.660420 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.660445 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.660490 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: 
\"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.660558 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjmgj\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.761617 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.761682 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjmgj\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.761721 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: E1002 16:36:14.761749 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.761769 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: E1002 16:36:14.761772 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.761788 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: E1002 16:36:14.761831 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:15.261811662 +0000 UTC m=+1134.011041189 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.762271 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.762276 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.762462 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.784336 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:14 crc kubenswrapper[4882]: I1002 16:36:14.785231 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjmgj\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:15 crc kubenswrapper[4882]: I1002 16:36:15.272118 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:15 crc kubenswrapper[4882]: E1002 16:36:15.272497 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:15 crc kubenswrapper[4882]: E1002 16:36:15.272517 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:15 crc kubenswrapper[4882]: E1002 16:36:15.272584 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:16.272562867 +0000 UTC m=+1135.021792394 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:16 crc kubenswrapper[4882]: I1002 16:36:16.288278 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:16 crc kubenswrapper[4882]: E1002 16:36:16.288552 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:16 crc kubenswrapper[4882]: E1002 16:36:16.289440 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:16 crc kubenswrapper[4882]: E1002 16:36:16.289541 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:18.289516795 +0000 UTC m=+1137.038746322 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.328332 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:18 crc kubenswrapper[4882]: E1002 16:36:18.328546 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:18 crc kubenswrapper[4882]: E1002 16:36:18.329032 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:18 crc kubenswrapper[4882]: E1002 16:36:18.329109 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:22.329089881 +0000 UTC m=+1141.078319408 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.415270 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nx9bj"] Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.418120 4882 util.go:30] "No sandbox for pod can be found. 
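
The mount failures above illustrate two things. First, etc-swift is a projected volume whose sources include the swift-ring-files ConfigMap; until that ConfigMap exists (it appears to be produced by the swift-ring-rebalance-nx9bj job scheduled in the same entries), every SetUp attempt fails with "not found". Second, the volume manager retries with a per-operation exponential backoff: durationBeforeRetry doubles through 500ms, 1s, 2s, 4s here, and 8s further down. A small sketch of that doubling, assuming the 500ms initial delay seen here and a 2m2s ceiling (the ceiling matches kubelet defaults but is not visible in this log):

    package main

    import (
    	"fmt"
    	"time"
    )

    // nextBackoff doubles the previous delay up to a limit, mirroring the
    // 500ms -> 1s -> 2s -> 4s -> 8s progression in the entries above.
    func nextBackoff(prev, limit time.Duration) time.Duration {
    	if prev == 0 {
    		return 500 * time.Millisecond // initial delay, as seen in the log
    	}
    	next := 2 * prev
    	if next > limit {
    		return limit
    	}
    	return next
    }

    func main() {
    	d := time.Duration(0)
    	for i := 0; i < 6; i++ {
    		d = nextBackoff(d, 2*time.Minute+2*time.Second) // ceiling is assumed
    		fmt.Println(d)                                  // 500ms 1s 2s 4s 8s 16s
    	}
    }
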
Need to start a new one" pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.424659 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.425021 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.425290 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.429645 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nx9bj"] Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432564 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432614 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432640 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432741 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432913 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.432966 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtn62\" (UniqueName: \"kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.433031 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 
16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534436 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534541 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534572 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534596 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534626 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534686 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.534729 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtn62\" (UniqueName: \"kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.535148 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.535493 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.535581 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.540576 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.540579 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.541257 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.556616 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtn62\" (UniqueName: \"kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62\") pod \"swift-ring-rebalance-nx9bj\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.691340 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:36:18 crc kubenswrapper[4882]: W1002 16:36:18.696703 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16d7f9d5_95d8_497b_8877_3f4c9e1dfc3b.slice/crio-501e5f43c1802e62ab6e7033f8484981b2c719ab78378961cfb0bd8dca134a9b WatchSource:0}: Error finding container 501e5f43c1802e62ab6e7033f8484981b2c719ab78378961cfb0bd8dca134a9b: Status 404 returned error can't find the container with id 501e5f43c1802e62ab6e7033f8484981b2c719ab78378961cfb0bd8dca134a9b Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.743123 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:18 crc kubenswrapper[4882]: I1002 16:36:18.964382 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nx9bj"] Oct 02 16:36:18 crc kubenswrapper[4882]: W1002 16:36:18.972644 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3616734_9206_483f_a173_6fa0dffe1f82.slice/crio-5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9 WatchSource:0}: Error finding container 5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9: Status 404 returned error can't find the container with id 5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9 Oct 02 16:36:19 crc kubenswrapper[4882]: I1002 16:36:19.126675 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59775c759f-z44vx" event={"ID":"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b","Type":"ContainerStarted","Data":"501e5f43c1802e62ab6e7033f8484981b2c719ab78378961cfb0bd8dca134a9b"} Oct 02 16:36:19 crc kubenswrapper[4882]: I1002 16:36:19.128058 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nx9bj" event={"ID":"f3616734-9206-483f-a173-6fa0dffe1f82","Type":"ContainerStarted","Data":"5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9"} Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.318441 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-b49v2"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.319855 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.329072 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b49v2"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.384581 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65cs\" (UniqueName: \"kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs\") pod \"keystone-db-create-b49v2\" (UID: \"a9308bac-63a2-4c6a-88b5-23d8083feac8\") " pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.477846 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jnfn2"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.479171 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.486716 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65cs\" (UniqueName: \"kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs\") pod \"keystone-db-create-b49v2\" (UID: \"a9308bac-63a2-4c6a-88b5-23d8083feac8\") " pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.487007 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jnfn2"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.514950 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65cs\" (UniqueName: \"kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs\") pod \"keystone-db-create-b49v2\" (UID: \"a9308bac-63a2-4c6a-88b5-23d8083feac8\") " pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.590188 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj269\" (UniqueName: \"kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269\") pod \"placement-db-create-jnfn2\" (UID: \"6c1352ce-9050-4709-bffd-c834b8ef1cb0\") " pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.650051 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.691280 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj269\" (UniqueName: \"kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269\") pod \"placement-db-create-jnfn2\" (UID: \"6c1352ce-9050-4709-bffd-c834b8ef1cb0\") " pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.709960 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj269\" (UniqueName: \"kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269\") pod \"placement-db-create-jnfn2\" (UID: \"6c1352ce-9050-4709-bffd-c834b8ef1cb0\") " pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.797772 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.846678 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qdmww"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.847752 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qdmww" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.857588 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qdmww"] Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.894837 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxxrd\" (UniqueName: \"kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd\") pod \"glance-db-create-qdmww\" (UID: \"58c9678e-5c7a-4a53-9e13-732c9bbc2cba\") " pod="openstack/glance-db-create-qdmww" Oct 02 16:36:21 crc kubenswrapper[4882]: I1002 16:36:21.998050 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxxrd\" (UniqueName: \"kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd\") pod \"glance-db-create-qdmww\" (UID: \"58c9678e-5c7a-4a53-9e13-732c9bbc2cba\") " pod="openstack/glance-db-create-qdmww" Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.014817 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxxrd\" (UniqueName: \"kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd\") pod \"glance-db-create-qdmww\" (UID: \"58c9678e-5c7a-4a53-9e13-732c9bbc2cba\") " pod="openstack/glance-db-create-qdmww" Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.087206 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b49v2"] Oct 02 16:36:22 crc kubenswrapper[4882]: W1002 16:36:22.100378 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9308bac_63a2_4c6a_88b5_23d8083feac8.slice/crio-9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d WatchSource:0}: Error finding container 9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d: Status 404 returned error can't find the container with id 9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.152818 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b49v2" event={"ID":"a9308bac-63a2-4c6a-88b5-23d8083feac8","Type":"ContainerStarted","Data":"9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d"} Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.189407 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qdmww" Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.230665 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jnfn2"] Oct 02 16:36:22 crc kubenswrapper[4882]: W1002 16:36:22.234418 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1352ce_9050_4709_bffd_c834b8ef1cb0.slice/crio-5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0 WatchSource:0}: Error finding container 5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0: Status 404 returned error can't find the container with id 5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0 Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.405252 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:22 crc kubenswrapper[4882]: E1002 16:36:22.405558 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:22 crc kubenswrapper[4882]: E1002 16:36:22.405595 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:22 crc kubenswrapper[4882]: E1002 16:36:22.405664 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:30.40564356 +0000 UTC m=+1149.154873087 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:22 crc kubenswrapper[4882]: I1002 16:36:22.642435 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qdmww"] Oct 02 16:36:23 crc kubenswrapper[4882]: I1002 16:36:23.169541 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jnfn2" event={"ID":"6c1352ce-9050-4709-bffd-c834b8ef1cb0","Type":"ContainerStarted","Data":"5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0"} Oct 02 16:36:23 crc kubenswrapper[4882]: I1002 16:36:23.171552 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qdmww" event={"ID":"58c9678e-5c7a-4a53-9e13-732c9bbc2cba","Type":"ContainerStarted","Data":"7bb9964dff65110964c8a4ceabf0658c3f17652871f58d3d84ab5887aeb70755"} Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.198385 4882 generic.go:334] "Generic (PLEG): container finished" podID="a9308bac-63a2-4c6a-88b5-23d8083feac8" containerID="304cdf18a2c81783fa09ece0a20e7edbd4a65aacd48e7453c72803ed54eaa9cd" exitCode=0 Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.198586 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b49v2" event={"ID":"a9308bac-63a2-4c6a-88b5-23d8083feac8","Type":"ContainerDied","Data":"304cdf18a2c81783fa09ece0a20e7edbd4a65aacd48e7453c72803ed54eaa9cd"} Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.208643 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jnfn2" event={"ID":"6c1352ce-9050-4709-bffd-c834b8ef1cb0","Type":"ContainerStarted","Data":"9cda4521dc256012ae948d734158af8a41aa53200ae9a9157a93bd3abc5344ef"} Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.211117 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qdmww" event={"ID":"58c9678e-5c7a-4a53-9e13-732c9bbc2cba","Type":"ContainerStarted","Data":"052368bb59f9409034eaecd11389763ff8b3157c6e295d8f7e64b4884fd743eb"} Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.230380 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-qdmww" podStartSLOduration=4.230363142 podStartE2EDuration="4.230363142s" podCreationTimestamp="2025-10-02 16:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:25.228162107 +0000 UTC m=+1143.977391634" watchObservedRunningTime="2025-10-02 16:36:25.230363142 +0000 UTC m=+1143.979592669" Oct 02 16:36:25 crc kubenswrapper[4882]: I1002 16:36:25.249598 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-jnfn2" podStartSLOduration=4.249577728 podStartE2EDuration="4.249577728s" podCreationTimestamp="2025-10-02 16:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:25.243567396 +0000 UTC m=+1143.992796923" watchObservedRunningTime="2025-10-02 16:36:25.249577728 +0000 UTC m=+1143.998807255" Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.219714 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerStarted","Data":"1c434c360e363c534658d517b5e5ca0981d70ee9a4edbc3206e881217869a5a9"} Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.222025 4882 generic.go:334] "Generic (PLEG): container finished" podID="58c9678e-5c7a-4a53-9e13-732c9bbc2cba" containerID="052368bb59f9409034eaecd11389763ff8b3157c6e295d8f7e64b4884fd743eb" exitCode=0 Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.222070 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qdmww" event={"ID":"58c9678e-5c7a-4a53-9e13-732c9bbc2cba","Type":"ContainerDied","Data":"052368bb59f9409034eaecd11389763ff8b3157c6e295d8f7e64b4884fd743eb"} Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.251072 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.574648875 podStartE2EDuration="37.251051533s" podCreationTimestamp="2025-10-02 16:35:49 +0000 UTC" firstStartedPulling="2025-10-02 16:35:58.325361203 +0000 UTC m=+1117.074590730" lastFinishedPulling="2025-10-02 16:36:25.001763861 +0000 UTC m=+1143.750993388" observedRunningTime="2025-10-02 16:36:26.239990883 +0000 UTC m=+1144.989220430" watchObservedRunningTime="2025-10-02 16:36:26.251051533 +0000 UTC m=+1145.000281060" Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.827032 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.882512 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 16:36:26 crc kubenswrapper[4882]: I1002 16:36:26.905674 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.076739 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65cs\" (UniqueName: \"kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs\") pod \"a9308bac-63a2-4c6a-88b5-23d8083feac8\" (UID: \"a9308bac-63a2-4c6a-88b5-23d8083feac8\") " Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.082091 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs" (OuterVolumeSpecName: "kube-api-access-w65cs") pod "a9308bac-63a2-4c6a-88b5-23d8083feac8" (UID: "a9308bac-63a2-4c6a-88b5-23d8083feac8"). InnerVolumeSpecName "kube-api-access-w65cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.179253 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65cs\" (UniqueName: \"kubernetes.io/projected/a9308bac-63a2-4c6a-88b5-23d8083feac8-kube-api-access-w65cs\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.232514 4882 generic.go:334] "Generic (PLEG): container finished" podID="6c1352ce-9050-4709-bffd-c834b8ef1cb0" containerID="9cda4521dc256012ae948d734158af8a41aa53200ae9a9157a93bd3abc5344ef" exitCode=0 Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.232603 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jnfn2" event={"ID":"6c1352ce-9050-4709-bffd-c834b8ef1cb0","Type":"ContainerDied","Data":"9cda4521dc256012ae948d734158af8a41aa53200ae9a9157a93bd3abc5344ef"} Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.237745 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b49v2" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.239137 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b49v2" event={"ID":"a9308bac-63a2-4c6a-88b5-23d8083feac8","Type":"ContainerDied","Data":"9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d"} Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.239188 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9014192e44997c5cc847df924744d015087f1077ff082898ca2b2c586ed7ec7d" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.239231 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 16:36:27 crc kubenswrapper[4882]: I1002 16:36:27.319376 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.268740 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qdmww" event={"ID":"58c9678e-5c7a-4a53-9e13-732c9bbc2cba","Type":"ContainerDied","Data":"7bb9964dff65110964c8a4ceabf0658c3f17652871f58d3d84ab5887aeb70755"} Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.269352 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb9964dff65110964c8a4ceabf0658c3f17652871f58d3d84ab5887aeb70755" Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.308954 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qdmww" Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.434190 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxxrd\" (UniqueName: \"kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd\") pod \"58c9678e-5c7a-4a53-9e13-732c9bbc2cba\" (UID: \"58c9678e-5c7a-4a53-9e13-732c9bbc2cba\") " Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.434722 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0" Oct 02 16:36:30 crc kubenswrapper[4882]: E1002 16:36:30.434966 4882 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 16:36:30 crc kubenswrapper[4882]: E1002 16:36:30.434993 4882 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 16:36:30 crc kubenswrapper[4882]: E1002 16:36:30.435046 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift podName:9cd26acb-1d48-48f3-b39d-b274bdcd3cce nodeName:}" failed. No retries permitted until 2025-10-02 16:36:46.435028329 +0000 UTC m=+1165.184257856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift") pod "swift-storage-0" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce") : configmap "swift-ring-files" not found Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.441891 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd" (OuterVolumeSpecName: "kube-api-access-kxxrd") pod "58c9678e-5c7a-4a53-9e13-732c9bbc2cba" (UID: "58c9678e-5c7a-4a53-9e13-732c9bbc2cba"). InnerVolumeSpecName "kube-api-access-kxxrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:30 crc kubenswrapper[4882]: I1002 16:36:30.536420 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxxrd\" (UniqueName: \"kubernetes.io/projected/58c9678e-5c7a-4a53-9e13-732c9bbc2cba-kube-api-access-kxxrd\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.202480 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.276547 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jnfn2" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.276550 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qdmww" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.276546 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jnfn2" event={"ID":"6c1352ce-9050-4709-bffd-c834b8ef1cb0","Type":"ContainerDied","Data":"5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0"} Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.277088 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a469cc11f83540e892306c66ee2189d684553dc9d12a89d6d2cbdaea31b66a0" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.349424 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj269\" (UniqueName: \"kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269\") pod \"6c1352ce-9050-4709-bffd-c834b8ef1cb0\" (UID: \"6c1352ce-9050-4709-bffd-c834b8ef1cb0\") " Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.353036 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269" (OuterVolumeSpecName: "kube-api-access-mj269") pod "6c1352ce-9050-4709-bffd-c834b8ef1cb0" (UID: "6c1352ce-9050-4709-bffd-c834b8ef1cb0"). InnerVolumeSpecName "kube-api-access-mj269". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.451781 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj269\" (UniqueName: \"kubernetes.io/projected/6c1352ce-9050-4709-bffd-c834b8ef1cb0-kube-api-access-mj269\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.964591 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c9ca-account-create-tgl7f"] Oct 02 16:36:31 crc kubenswrapper[4882]: E1002 16:36:31.965134 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9308bac-63a2-4c6a-88b5-23d8083feac8" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.965159 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9308bac-63a2-4c6a-88b5-23d8083feac8" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: E1002 16:36:31.965193 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1352ce-9050-4709-bffd-c834b8ef1cb0" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.965204 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1352ce-9050-4709-bffd-c834b8ef1cb0" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: E1002 16:36:31.965250 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c9678e-5c7a-4a53-9e13-732c9bbc2cba" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.965261 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c9678e-5c7a-4a53-9e13-732c9bbc2cba" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.965518 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1352ce-9050-4709-bffd-c834b8ef1cb0" containerName="mariadb-database-create" Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.965550 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9308bac-63a2-4c6a-88b5-23d8083feac8" containerName="mariadb-database-create" Oct 02 16:36:31 crc 
Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.966376 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c9ca-account-create-tgl7f"
Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.969524 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 02 16:36:31 crc kubenswrapper[4882]: I1002 16:36:31.973982 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c9ca-account-create-tgl7f"]
Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.061436 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4w9\" (UniqueName: \"kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9\") pod \"glance-c9ca-account-create-tgl7f\" (UID: \"29c3b95d-74fa-4e1d-a3ed-422750068a7d\") " pod="openstack/glance-c9ca-account-create-tgl7f"
Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.163017 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4w9\" (UniqueName: \"kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9\") pod \"glance-c9ca-account-create-tgl7f\" (UID: \"29c3b95d-74fa-4e1d-a3ed-422750068a7d\") " pod="openstack/glance-c9ca-account-create-tgl7f"
Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.184332 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4w9\" (UniqueName: \"kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9\") pod \"glance-c9ca-account-create-tgl7f\" (UID: \"29c3b95d-74fa-4e1d-a3ed-422750068a7d\") " pod="openstack/glance-c9ca-account-create-tgl7f"
Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.284065 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c9ca-account-create-tgl7f"
Need to start a new one" pod="openstack/glance-c9ca-account-create-tgl7f" Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.287079 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerStarted","Data":"7b2b5dfddc7012648f92825b22fc3dff9d26b581f27c153da193dcb7c40d999c"} Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.290629 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerDied","Data":"0d67f1f0413558904549638569857532409de73c996c52a84ea4614ae2ae72a3"} Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.290551 4882 generic.go:334] "Generic (PLEG): container finished" podID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerID="0d67f1f0413558904549638569857532409de73c996c52a84ea4614ae2ae72a3" exitCode=0 Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.297615 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nx9bj" event={"ID":"f3616734-9206-483f-a173-6fa0dffe1f82","Type":"ContainerStarted","Data":"36cab3f3790ff0ea86f4b5660fdfefeef5565c1d5b18cf3c3eaa32ba022e195f"} Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.322932 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=23.958351478 podStartE2EDuration="44.32291291s" podCreationTimestamp="2025-10-02 16:35:48 +0000 UTC" firstStartedPulling="2025-10-02 16:35:57.757348 +0000 UTC m=+1116.506577527" lastFinishedPulling="2025-10-02 16:36:18.121909422 +0000 UTC m=+1136.871138959" observedRunningTime="2025-10-02 16:36:32.31305035 +0000 UTC m=+1151.062279877" watchObservedRunningTime="2025-10-02 16:36:32.32291291 +0000 UTC m=+1151.072142437" Oct 02 16:36:32 crc kubenswrapper[4882]: I1002 16:36:32.734854 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c9ca-account-create-tgl7f"] Oct 02 16:36:32 crc kubenswrapper[4882]: W1002 16:36:32.743091 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c3b95d_74fa_4e1d_a3ed_422750068a7d.slice/crio-800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7 WatchSource:0}: Error finding container 800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7: Status 404 returned error can't find the container with id 800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.305549 4882 generic.go:334] "Generic (PLEG): container finished" podID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerID="56f5e136c1bbeef542d74f85786e2eb58e6cd2e6742745adf39c24f6ed827f4d" exitCode=0 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.305599 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerDied","Data":"56f5e136c1bbeef542d74f85786e2eb58e6cd2e6742745adf39c24f6ed827f4d"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.315660 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rhsqp" event={"ID":"92887968-fdd5-4653-a151-70e4a8f963fc","Type":"ContainerStarted","Data":"6bc9f5c33140d2b4ca0bbd08d5002572bc61b1f513ed811d7b913685891c44e2"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.320643 4882 generic.go:334] "Generic (PLEG): container 
finished" podID="29c3b95d-74fa-4e1d-a3ed-422750068a7d" containerID="ab30ebb2f1646bffd938b73b39cd780a7082a80b77005b3a0bee01532eaec10e" exitCode=0 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.320734 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ca-account-create-tgl7f" event={"ID":"29c3b95d-74fa-4e1d-a3ed-422750068a7d","Type":"ContainerDied","Data":"ab30ebb2f1646bffd938b73b39cd780a7082a80b77005b3a0bee01532eaec10e"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.320781 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ca-account-create-tgl7f" event={"ID":"29c3b95d-74fa-4e1d-a3ed-422750068a7d","Type":"ContainerStarted","Data":"800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.324664 4882 generic.go:334] "Generic (PLEG): container finished" podID="25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" containerID="b289df3ddb4c1bcb398ae135633bb97d64cf4278d1e06f6d881906695cdc5300" exitCode=0 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.324761 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" event={"ID":"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f","Type":"ContainerDied","Data":"b289df3ddb4c1bcb398ae135633bb97d64cf4278d1e06f6d881906695cdc5300"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.329013 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerStarted","Data":"578c94adafe095435dbcb87f647011c96a388fb1edfe93dbe48b4ca0639d68b2"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.330087 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.333158 4882 generic.go:334] "Generic (PLEG): container finished" podID="1a102fad-3c46-4428-9c16-b2dcf62c9cf1" containerID="a7001c126d25ed0bf1e0520e64bfa95701219ee1571177a99e77a2b45034cc2f" exitCode=0 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.333446 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" event={"ID":"1a102fad-3c46-4428-9c16-b2dcf62c9cf1","Type":"ContainerDied","Data":"a7001c126d25ed0bf1e0520e64bfa95701219ee1571177a99e77a2b45034cc2f"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.336187 4882 generic.go:334] "Generic (PLEG): container finished" podID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerID="8eb0dfd758bae7ec26875ce74450792d723e471e7f925fd15d77e95aace4a091" exitCode=0 Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.336365 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59775c759f-z44vx" event={"ID":"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b","Type":"ContainerDied","Data":"8eb0dfd758bae7ec26875ce74450792d723e471e7f925fd15d77e95aace4a091"} Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.401548 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rhsqp" podStartSLOduration=24.401525296 podStartE2EDuration="24.401525296s" podCreationTimestamp="2025-10-02 16:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:33.384532886 +0000 UTC m=+1152.133762413" watchObservedRunningTime="2025-10-02 16:36:33.401525296 +0000 UTC m=+1152.150754823" Oct 02 16:36:33 
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.533557 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nx9bj" podStartSLOduration=3.307387828 podStartE2EDuration="15.533534625s" podCreationTimestamp="2025-10-02 16:36:18 +0000 UTC" firstStartedPulling="2025-10-02 16:36:18.97599702 +0000 UTC m=+1137.725226547" lastFinishedPulling="2025-10-02 16:36:31.202143817 +0000 UTC m=+1149.951373344" observedRunningTime="2025-10-02 16:36:33.51872365 +0000 UTC m=+1152.267953177" watchObservedRunningTime="2025-10-02 16:36:33.533534625 +0000 UTC m=+1152.282764152"
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.738618 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.786315 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt"
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.794468 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc\") pod \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") "
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.794542 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb\") pod \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") "
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.794602 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfp4n\" (UniqueName: \"kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n\") pod \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") "
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.794778 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config\") pod \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\" (UID: \"1a102fad-3c46-4428-9c16-b2dcf62c9cf1\") "
Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.800273 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n" (OuterVolumeSpecName: "kube-api-access-cfp4n") pod "1a102fad-3c46-4428-9c16-b2dcf62c9cf1" (UID: "1a102fad-3c46-4428-9c16-b2dcf62c9cf1"). InnerVolumeSpecName "kube-api-access-cfp4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.817955 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a102fad-3c46-4428-9c16-b2dcf62c9cf1" (UID: "1a102fad-3c46-4428-9c16-b2dcf62c9cf1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.818554 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a102fad-3c46-4428-9c16-b2dcf62c9cf1" (UID: "1a102fad-3c46-4428-9c16-b2dcf62c9cf1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.818984 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config" (OuterVolumeSpecName: "config") pod "1a102fad-3c46-4428-9c16-b2dcf62c9cf1" (UID: "1a102fad-3c46-4428-9c16-b2dcf62c9cf1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.896914 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb\") pod \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897015 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb\") pod \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897085 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qh5d\" (UniqueName: \"kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d\") pod \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897139 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc\") pod \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897164 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config\") pod \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\" (UID: \"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f\") " Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897622 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897647 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897661 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.897677 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfp4n\" (UniqueName: \"kubernetes.io/projected/1a102fad-3c46-4428-9c16-b2dcf62c9cf1-kube-api-access-cfp4n\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.901097 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d" (OuterVolumeSpecName: "kube-api-access-9qh5d") pod "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" (UID: "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f"). InnerVolumeSpecName "kube-api-access-9qh5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.917669 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config" (OuterVolumeSpecName: "config") pod "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" (UID: "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.921358 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" (UID: "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.921539 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" (UID: "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.923992 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" (UID: "25b3f44c-4a87-4e94-b7f3-a7f07752ea6f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.999471 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.999513 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.999529 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qh5d\" (UniqueName: \"kubernetes.io/projected/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-kube-api-access-9qh5d\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.999544 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:33 crc kubenswrapper[4882]: I1002 16:36:33.999555 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.345440 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerStarted","Data":"5681a9da320379f932adf71053bca8a31e077d6a1f4a09e0b83765c5625e4ade"} Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.346286 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.346978 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" event={"ID":"25b3f44c-4a87-4e94-b7f3-a7f07752ea6f","Type":"ContainerDied","Data":"3ee5ed19f32c0a694d51378f047a238dedca4916de0843d28679d60758dc0684"} Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.347022 4882 scope.go:117] "RemoveContainer" containerID="b289df3ddb4c1bcb398ae135633bb97d64cf4278d1e06f6d881906695cdc5300" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.347109 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d58dc6cf-bj9kt" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.350053 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" event={"ID":"1a102fad-3c46-4428-9c16-b2dcf62c9cf1","Type":"ContainerDied","Data":"af7b55ec6b22d008c9b768a9164067af0febd7280679d022a54d5a707aa61c09"} Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.350063 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d8b9bcdd7-lprvx" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.353971 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59775c759f-z44vx" event={"ID":"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b","Type":"ContainerStarted","Data":"a7f79ff6b4f0d3f0eeb3caccd7cd76e6e7917496e6971694dbe19c7d4b3edc27"} Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.354294 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.371625 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.962464812 podStartE2EDuration="57.371605547s" podCreationTimestamp="2025-10-02 16:35:37 +0000 UTC" firstStartedPulling="2025-10-02 16:35:39.560322118 +0000 UTC m=+1098.309551645" lastFinishedPulling="2025-10-02 16:35:57.969462853 +0000 UTC m=+1116.718692380" observedRunningTime="2025-10-02 16:36:34.370506239 +0000 UTC m=+1153.119735766" watchObservedRunningTime="2025-10-02 16:36:34.371605547 +0000 UTC m=+1153.120835074" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.402336 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podStartSLOduration=21.402313594 podStartE2EDuration="21.402313594s" podCreationTimestamp="2025-10-02 16:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:34.392779853 +0000 UTC m=+1153.142009390" watchObservedRunningTime="2025-10-02 16:36:34.402313594 +0000 UTC m=+1153.151543121" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.417433 4882 scope.go:117] "RemoveContainer" containerID="a7001c126d25ed0bf1e0520e64bfa95701219ee1571177a99e77a2b45034cc2f" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.457830 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.458742 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.493749 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.500346 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84d58dc6cf-bj9kt"] Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.517591 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.527823 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.535529 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d8b9bcdd7-lprvx"] Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.728145 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c9ca-account-create-tgl7f" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.774516 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a102fad-3c46-4428-9c16-b2dcf62c9cf1" path="/var/lib/kubelet/pods/1a102fad-3c46-4428-9c16-b2dcf62c9cf1/volumes" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.776520 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" path="/var/lib/kubelet/pods/25b3f44c-4a87-4e94-b7f3-a7f07752ea6f/volumes" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.813861 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx4w9\" (UniqueName: \"kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9\") pod \"29c3b95d-74fa-4e1d-a3ed-422750068a7d\" (UID: \"29c3b95d-74fa-4e1d-a3ed-422750068a7d\") " Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.818377 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9" (OuterVolumeSpecName: "kube-api-access-sx4w9") pod "29c3b95d-74fa-4e1d-a3ed-422750068a7d" (UID: "29c3b95d-74fa-4e1d-a3ed-422750068a7d"). InnerVolumeSpecName "kube-api-access-sx4w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:34 crc kubenswrapper[4882]: I1002 16:36:34.919803 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx4w9\" (UniqueName: \"kubernetes.io/projected/29c3b95d-74fa-4e1d-a3ed-422750068a7d-kube-api-access-sx4w9\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.364742 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ca-account-create-tgl7f" event={"ID":"29c3b95d-74fa-4e1d-a3ed-422750068a7d","Type":"ContainerDied","Data":"800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7"} Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.364783 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800e8d0e177d6d5551e0116f3c345b5bef208981c505bfd8badfb3c58356a9e7" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.364919 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c9ca-account-create-tgl7f" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.415404 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604191 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 16:36:35 crc kubenswrapper[4882]: E1002 16:36:35.604598 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604620 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: E1002 16:36:35.604636 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a102fad-3c46-4428-9c16-b2dcf62c9cf1" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604645 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a102fad-3c46-4428-9c16-b2dcf62c9cf1" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: E1002 16:36:35.604670 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c3b95d-74fa-4e1d-a3ed-422750068a7d" containerName="mariadb-account-create" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604679 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c3b95d-74fa-4e1d-a3ed-422750068a7d" containerName="mariadb-account-create" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604867 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a102fad-3c46-4428-9c16-b2dcf62c9cf1" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604896 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c3b95d-74fa-4e1d-a3ed-422750068a7d" containerName="mariadb-account-create" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.604910 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b3f44c-4a87-4e94-b7f3-a7f07752ea6f" containerName="init" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.605955 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.613374 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.613529 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.613645 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.613833 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ldsqt" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.619851 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733005 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5nk\" (UniqueName: \"kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733073 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733123 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733160 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733190 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733261 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.733347 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0" Oct 02 16:36:35 crc kubenswrapper[4882]: 
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835360 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835400 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5nk\" (UniqueName: \"kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835443 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835485 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835521 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.835550 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.836264 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.836450 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.836648 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.840620 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
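Each ovn-northd-0 volume above moves through the same three log lines: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), operationExecutor.MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). The shape behind those messages is a desired-state versus actual-state reconciliation loop; a deliberately simplified Go sketch of that shape (the states and transitions here are assumptions for illustration, not kubelet source):

package main

import "fmt"

type state int

const (
	attached state = iota // VerifyControllerAttachedVolume passed
	mounting              // MountVolume started
	mounted               // MountVolume.SetUp succeeded
)

// reconcile advances every desired volume one step toward mounted.
func reconcile(desired []string, actual map[string]state) {
	for _, vol := range desired {
		switch actual[vol] {
		case attached:
			fmt.Println("MountVolume started for volume", vol)
			actual[vol] = mounting
		case mounting:
			fmt.Println("MountVolume.SetUp succeeded for volume", vol)
			actual[vol] = mounted
		}
	}
}

func main() {
	desired := []string{"scripts", "config", "ovn-rundir", "combined-ca-bundle"}
	actual := map[string]state{}
	for i := 0; i < 2; i++ { // two passes take each volume to mounted
		reconcile(desired, actual)
	}
}

Repeated passes over the same desired set are why the "started" and "succeeded" messages above arrive in clusters rather than strictly interleaved.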
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.841247 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.841846 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.869089 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5nk\" (UniqueName: \"kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk\") pod \"ovn-northd-0\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " pod="openstack/ovn-northd-0"
Oct 02 16:36:35 crc kubenswrapper[4882]: I1002 16:36:35.929800 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 02 16:36:36 crc kubenswrapper[4882]: I1002 16:36:36.422910 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 16:36:36 crc kubenswrapper[4882]: W1002 16:36:36.425312 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05fb59c5_aa61_4ec3_866f_3a4551737f80.slice/crio-d69b7725b81aab41803daf13609b395e712fe9ae5c1d63f02a9c22bbf377d26e WatchSource:0}: Error finding container d69b7725b81aab41803daf13609b395e712fe9ae5c1d63f02a9c22bbf377d26e: Status 404 returned error can't find the container with id d69b7725b81aab41803daf13609b395e712fe9ae5c1d63f02a9c22bbf377d26e
Oct 02 16:36:36 crc kubenswrapper[4882]: I1002 16:36:36.561044 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v7dcx" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" probeResult="failure" output=<
Oct 02 16:36:36 crc kubenswrapper[4882]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 02 16:36:36 crc kubenswrapper[4882]: >
Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.061746 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8kjnt"]
Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.063125 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8kjnt"
Need to start a new one" pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.066410 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.080207 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b2q2c" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.085048 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8kjnt"] Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.160136 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxh2v\" (UniqueName: \"kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.160274 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.160305 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.160345 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.261948 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.262003 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.262055 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.262130 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxh2v\" (UniqueName: \"kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v\") pod 
\"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.268840 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.269732 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.270796 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.280503 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxh2v\" (UniqueName: \"kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v\") pod \"glance-db-sync-8kjnt\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.388091 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerStarted","Data":"d69b7725b81aab41803daf13609b395e712fe9ae5c1d63f02a9c22bbf377d26e"} Oct 02 16:36:37 crc kubenswrapper[4882]: I1002 16:36:37.388359 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8kjnt" Oct 02 16:36:38 crc kubenswrapper[4882]: I1002 16:36:38.442543 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8kjnt"] Oct 02 16:36:38 crc kubenswrapper[4882]: I1002 16:36:38.691411 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:36:38 crc kubenswrapper[4882]: I1002 16:36:38.774648 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:36:38 crc kubenswrapper[4882]: I1002 16:36:38.775044 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="dnsmasq-dns" containerID="cri-o://d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8" gracePeriod=10 Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.305992 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.390864 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.390934 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.407764 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc\") pod \"9211fb00-17ac-4591-9058-6de64e3ef7ba\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.408228 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config\") pod \"9211fb00-17ac-4591-9058-6de64e3ef7ba\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.408355 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj42z\" (UniqueName: \"kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z\") pod \"9211fb00-17ac-4591-9058-6de64e3ef7ba\" (UID: \"9211fb00-17ac-4591-9058-6de64e3ef7ba\") " Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.444745 4882 generic.go:334] "Generic (PLEG): container finished" podID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerID="d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8" exitCode=0 Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.444895 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" event={"ID":"9211fb00-17ac-4591-9058-6de64e3ef7ba","Type":"ContainerDied","Data":"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8"} Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.444940 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" event={"ID":"9211fb00-17ac-4591-9058-6de64e3ef7ba","Type":"ContainerDied","Data":"c3b0c6868d7d795ecd941a5c19e8d6c7de14510ddfea102a30f113a684b1fe16"} Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.444963 4882 scope.go:117] "RemoveContainer" containerID="d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.445158 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-2kf5r" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.461728 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z" (OuterVolumeSpecName: "kube-api-access-fj42z") pod "9211fb00-17ac-4591-9058-6de64e3ef7ba" (UID: "9211fb00-17ac-4591-9058-6de64e3ef7ba"). InnerVolumeSpecName "kube-api-access-fj42z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.473636 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerStarted","Data":"6f83cafdb181c103837b92a2bd78dedec85779409ecdc413b2e067e64c5babcf"} Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.473702 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerStarted","Data":"cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4"} Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.474352 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.495199 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config" (OuterVolumeSpecName: "config") pod "9211fb00-17ac-4591-9058-6de64e3ef7ba" (UID: "9211fb00-17ac-4591-9058-6de64e3ef7ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.508546 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.51352893 podStartE2EDuration="4.508523261s" podCreationTimestamp="2025-10-02 16:36:35 +0000 UTC" firstStartedPulling="2025-10-02 16:36:36.428668826 +0000 UTC m=+1155.177898363" lastFinishedPulling="2025-10-02 16:36:38.423663157 +0000 UTC m=+1157.172892694" observedRunningTime="2025-10-02 16:36:39.502074527 +0000 UTC m=+1158.251304054" watchObservedRunningTime="2025-10-02 16:36:39.508523261 +0000 UTC m=+1158.257752788" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.513416 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj42z\" (UniqueName: \"kubernetes.io/projected/9211fb00-17ac-4591-9058-6de64e3ef7ba-kube-api-access-fj42z\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.513470 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.513691 4882 scope.go:117] "RemoveContainer" containerID="be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.513875 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8kjnt" event={"ID":"6af4887d-ad2c-42e6-a473-88947a33d7cd","Type":"ContainerStarted","Data":"3e5aa6c13400757d73bf8fe7c4216d804d2cc7b2f26cde8e7c03978b57e84336"} Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.530842 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9211fb00-17ac-4591-9058-6de64e3ef7ba" (UID: "9211fb00-17ac-4591-9058-6de64e3ef7ba"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.563533 4882 scope.go:117] "RemoveContainer" containerID="d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8" Oct 02 16:36:39 crc kubenswrapper[4882]: E1002 16:36:39.565734 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8\": container with ID starting with d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8 not found: ID does not exist" containerID="d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.565795 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8"} err="failed to get container status \"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8\": rpc error: code = NotFound desc = could not find container \"d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8\": container with ID starting with d20fd34242b9bfd812aeb9a9834ab173f8cae8d0d752bf418a22c881e7e044a8 not found: ID does not exist" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.565828 4882 scope.go:117] "RemoveContainer" containerID="be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc" Oct 02 16:36:39 crc kubenswrapper[4882]: E1002 16:36:39.566303 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc\": container with ID starting with be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc not found: ID does not exist" containerID="be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.566331 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc"} err="failed to get container status \"be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc\": rpc error: code = NotFound desc = could not find container \"be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc\": container with ID starting with be84af625358a3ca309a0c74358ae57b1ac1806dbf57e33118ffa1493e9191cc not found: ID does not exist" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.614741 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9211fb00-17ac-4591-9058-6de64e3ef7ba-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.778064 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:36:39 crc kubenswrapper[4882]: I1002 16:36:39.784770 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-2kf5r"] Oct 02 16:36:40 crc kubenswrapper[4882]: I1002 16:36:40.525703 4882 generic.go:334] "Generic (PLEG): container finished" podID="f3616734-9206-483f-a173-6fa0dffe1f82" containerID="36cab3f3790ff0ea86f4b5660fdfefeef5565c1d5b18cf3c3eaa32ba022e195f" exitCode=0 Oct 02 16:36:40 crc kubenswrapper[4882]: I1002 16:36:40.525792 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nx9bj" 
event={"ID":"f3616734-9206-483f-a173-6fa0dffe1f82","Type":"ContainerDied","Data":"36cab3f3790ff0ea86f4b5660fdfefeef5565c1d5b18cf3c3eaa32ba022e195f"} Oct 02 16:36:40 crc kubenswrapper[4882]: I1002 16:36:40.803035 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" path="/var/lib/kubelet/pods/9211fb00-17ac-4591-9058-6de64e3ef7ba/volumes" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.365196 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b416-account-create-2nr92"] Oct 02 16:36:41 crc kubenswrapper[4882]: E1002 16:36:41.365712 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="dnsmasq-dns" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.365735 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="dnsmasq-dns" Oct 02 16:36:41 crc kubenswrapper[4882]: E1002 16:36:41.365762 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="init" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.365770 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="init" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.365968 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9211fb00-17ac-4591-9058-6de64e3ef7ba" containerName="dnsmasq-dns" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.366681 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.369522 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.379906 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b416-account-create-2nr92"] Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.447754 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpsg\" (UniqueName: \"kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg\") pod \"keystone-b416-account-create-2nr92\" (UID: \"09b9fc00-b554-4426-ba0e-7bcbef667297\") " pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.548868 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-52af-account-create-rvw9j"] Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.549911 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzpsg\" (UniqueName: \"kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg\") pod \"keystone-b416-account-create-2nr92\" (UID: \"09b9fc00-b554-4426-ba0e-7bcbef667297\") " pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.550105 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.553629 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.567271 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-52af-account-create-rvw9j"] Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.611158 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzpsg\" (UniqueName: \"kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg\") pod \"keystone-b416-account-create-2nr92\" (UID: \"09b9fc00-b554-4426-ba0e-7bcbef667297\") " pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.624449 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v7dcx" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" probeResult="failure" output=< Oct 02 16:36:41 crc kubenswrapper[4882]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 16:36:41 crc kubenswrapper[4882]: > Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.653926 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlb78\" (UniqueName: \"kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78\") pod \"placement-52af-account-create-rvw9j\" (UID: \"41dfd05c-8ce6-4327-978f-1e936b530b16\") " pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.674975 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.685651 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.698951 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.758591 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlb78\" (UniqueName: \"kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78\") pod \"placement-52af-account-create-rvw9j\" (UID: \"41dfd05c-8ce6-4327-978f-1e936b530b16\") " pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.782090 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlb78\" (UniqueName: \"kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78\") pod \"placement-52af-account-create-rvw9j\" (UID: \"41dfd05c-8ce6-4327-978f-1e936b530b16\") " pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.888744 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.971104 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v7dcx-config-4fx6t"] Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.972640 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:41 crc kubenswrapper[4882]: I1002 16:36:41.980838 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.003934 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx-config-4fx6t"] Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066405 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066472 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066519 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066547 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066567 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcs4m\" (UniqueName: \"kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.066614 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.126887 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.169175 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.169294 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.169334 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcs4m\" (UniqueName: \"kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.170642 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.170731 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.170862 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.170985 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.171211 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.170866 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " 
pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.171962 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.172432 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.189933 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b416-account-create-2nr92"] Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.193609 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcs4m\" (UniqueName: \"kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m\") pod \"ovn-controller-v7dcx-config-4fx6t\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273498 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273649 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273686 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273721 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273830 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.273876 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtn62\" (UniqueName: \"kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc 
kubenswrapper[4882]: I1002 16:36:42.273914 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts\") pod \"f3616734-9206-483f-a173-6fa0dffe1f82\" (UID: \"f3616734-9206-483f-a173-6fa0dffe1f82\") " Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.277640 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.279905 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.294915 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62" (OuterVolumeSpecName: "kube-api-access-mtn62") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "kube-api-access-mtn62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.301041 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.311959 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts" (OuterVolumeSpecName: "scripts") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.320884 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.346925 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.354394 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f3616734-9206-483f-a173-6fa0dffe1f82" (UID: "f3616734-9206-483f-a173-6fa0dffe1f82"). InnerVolumeSpecName "swiftconf". 
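
[Editor's note] Each UnmountVolume.TearDown success above for the finished swift-ring-rebalance pod is followed, just below, by a "Volume detached ... DevicePath \"\"" line: the mount is torn down first, then the volume is cleared from the reconciler's record of what is actually mounted, and once nothing remains the pod's volumes directory can be garbage-collected (the "Cleaned up orphaned pod volumes dir" entries elsewhere in this log). A toy model of that two-step bookkeeping, with hypothetical type and function names:

```go
package main

import "fmt"

// world is a toy stand-in for the kubelet's record of mounted volumes,
// keyed by pod UID; the names here are illustrative, not kubelet types.
type world map[string]map[string]bool

func (w world) tearDown(podUID, volume string) {
	fmt.Printf("UnmountVolume.TearDown succeeded for volume %q pod %q\n", volume, podUID)
	delete(w[podUID], volume)
}

func (w world) markDetached(podUID, volume string) {
	fmt.Printf("Volume detached for volume %q DevicePath \"\"\n", volume)
	if len(w[podUID]) == 0 {
		fmt.Printf("Cleaned up orphaned pod volumes dir podUID=%q\n", podUID)
	}
}

func main() {
	pod := "f3616734-9206-483f-a173-6fa0dffe1f82"
	w := world{pod: {"swiftconf": true, "scripts": true}}
	for _, v := range []string{"swiftconf", "scripts"} {
		w.tearDown(pod, v)
		w.markDetached(pod, v)
	}
}
```
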
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375609 4882 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375650 4882 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f3616734-9206-483f-a173-6fa0dffe1f82-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375663 4882 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375678 4882 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375693 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtn62\" (UniqueName: \"kubernetes.io/projected/f3616734-9206-483f-a173-6fa0dffe1f82-kube-api-access-mtn62\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375709 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3616734-9206-483f-a173-6fa0dffe1f82-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.375722 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3616734-9206-483f-a173-6fa0dffe1f82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.409296 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-52af-account-create-rvw9j"] Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.554962 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nx9bj" event={"ID":"f3616734-9206-483f-a173-6fa0dffe1f82","Type":"ContainerDied","Data":"5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9"} Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.555029 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2236cd256202f026477b5e3628419454cd5c9d4d13f8ea65cc533bd29867f9" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.555109 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nx9bj" Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.559132 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52af-account-create-rvw9j" event={"ID":"41dfd05c-8ce6-4327-978f-1e936b530b16","Type":"ContainerStarted","Data":"823afb167d11a2efebfbe657185ed1dfa32211a42d3c1e2d00749fe27ce851dd"} Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.562309 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b416-account-create-2nr92" event={"ID":"09b9fc00-b554-4426-ba0e-7bcbef667297","Type":"ContainerStarted","Data":"53d3a2dd59d9fce5a2ae1df8874af4314a5c1206051323fd6c315decab3bb56c"} Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.562343 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b416-account-create-2nr92" event={"ID":"09b9fc00-b554-4426-ba0e-7bcbef667297","Type":"ContainerStarted","Data":"82670a4edaecae925013f19e4a76789115b68082a8430a6c46f6f5a564967ce5"} Oct 02 16:36:42 crc kubenswrapper[4882]: I1002 16:36:42.582394 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b416-account-create-2nr92" podStartSLOduration=1.5823738330000001 podStartE2EDuration="1.582373833s" podCreationTimestamp="2025-10-02 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:42.575981982 +0000 UTC m=+1161.325211509" watchObservedRunningTime="2025-10-02 16:36:42.582373833 +0000 UTC m=+1161.331603360" Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.076538 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx-config-4fx6t"] Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.595865 4882 generic.go:334] "Generic (PLEG): container finished" podID="09b9fc00-b554-4426-ba0e-7bcbef667297" containerID="53d3a2dd59d9fce5a2ae1df8874af4314a5c1206051323fd6c315decab3bb56c" exitCode=0 Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.597004 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b416-account-create-2nr92" event={"ID":"09b9fc00-b554-4426-ba0e-7bcbef667297","Type":"ContainerDied","Data":"53d3a2dd59d9fce5a2ae1df8874af4314a5c1206051323fd6c315decab3bb56c"} Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.601714 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-4fx6t" event={"ID":"96c88c79-2285-4496-af8a-6473f2469577","Type":"ContainerStarted","Data":"2312e65cccff981a32fb124e131b2d0f9646908568d575f1181ef6833908e204"} Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.601774 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-4fx6t" event={"ID":"96c88c79-2285-4496-af8a-6473f2469577","Type":"ContainerStarted","Data":"a5a3cb022e313b3bc81b4ef909ff390c071452f3f13c148e331b668909d4f161"} Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.609392 4882 generic.go:334] "Generic (PLEG): container finished" podID="41dfd05c-8ce6-4327-978f-1e936b530b16" containerID="bb5a7c2330538131246f478652e9a6a981806ba2ad6fa95077b86414166cb896" exitCode=0 Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.609464 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52af-account-create-rvw9j" 
event={"ID":"41dfd05c-8ce6-4327-978f-1e936b530b16","Type":"ContainerDied","Data":"bb5a7c2330538131246f478652e9a6a981806ba2ad6fa95077b86414166cb896"} Oct 02 16:36:43 crc kubenswrapper[4882]: I1002 16:36:43.670093 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v7dcx-config-4fx6t" podStartSLOduration=2.670065338 podStartE2EDuration="2.670065338s" podCreationTimestamp="2025-10-02 16:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:36:43.662463887 +0000 UTC m=+1162.411693414" watchObservedRunningTime="2025-10-02 16:36:43.670065338 +0000 UTC m=+1162.419294865" Oct 02 16:36:44 crc kubenswrapper[4882]: I1002 16:36:44.619647 4882 generic.go:334] "Generic (PLEG): container finished" podID="96c88c79-2285-4496-af8a-6473f2469577" containerID="2312e65cccff981a32fb124e131b2d0f9646908568d575f1181ef6833908e204" exitCode=0 Oct 02 16:36:44 crc kubenswrapper[4882]: I1002 16:36:44.619728 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-4fx6t" event={"ID":"96c88c79-2285-4496-af8a-6473f2469577","Type":"ContainerDied","Data":"2312e65cccff981a32fb124e131b2d0f9646908568d575f1181ef6833908e204"} Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.027719 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.034574 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.176061 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlb78\" (UniqueName: \"kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78\") pod \"41dfd05c-8ce6-4327-978f-1e936b530b16\" (UID: \"41dfd05c-8ce6-4327-978f-1e936b530b16\") " Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.176196 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzpsg\" (UniqueName: \"kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg\") pod \"09b9fc00-b554-4426-ba0e-7bcbef667297\" (UID: \"09b9fc00-b554-4426-ba0e-7bcbef667297\") " Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.189718 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78" (OuterVolumeSpecName: "kube-api-access-qlb78") pod "41dfd05c-8ce6-4327-978f-1e936b530b16" (UID: "41dfd05c-8ce6-4327-978f-1e936b530b16"). InnerVolumeSpecName "kube-api-access-qlb78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.189795 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg" (OuterVolumeSpecName: "kube-api-access-rzpsg") pod "09b9fc00-b554-4426-ba0e-7bcbef667297" (UID: "09b9fc00-b554-4426-ba0e-7bcbef667297"). InnerVolumeSpecName "kube-api-access-rzpsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.279027 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzpsg\" (UniqueName: \"kubernetes.io/projected/09b9fc00-b554-4426-ba0e-7bcbef667297-kube-api-access-rzpsg\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.279073 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlb78\" (UniqueName: \"kubernetes.io/projected/41dfd05c-8ce6-4327-978f-1e936b530b16-kube-api-access-qlb78\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.631840 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b416-account-create-2nr92" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.631853 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b416-account-create-2nr92" event={"ID":"09b9fc00-b554-4426-ba0e-7bcbef667297","Type":"ContainerDied","Data":"82670a4edaecae925013f19e4a76789115b68082a8430a6c46f6f5a564967ce5"} Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.631885 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82670a4edaecae925013f19e4a76789115b68082a8430a6c46f6f5a564967ce5" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.632925 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-52af-account-create-rvw9j" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.632953 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-52af-account-create-rvw9j" event={"ID":"41dfd05c-8ce6-4327-978f-1e936b530b16","Type":"ContainerDied","Data":"823afb167d11a2efebfbe657185ed1dfa32211a42d3c1e2d00749fe27ce851dd"} Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.633016 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823afb167d11a2efebfbe657185ed1dfa32211a42d3c1e2d00749fe27ce851dd" Oct 02 16:36:45 crc kubenswrapper[4882]: I1002 16:36:45.906132 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-4fx6t" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091319 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091407 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcs4m\" (UniqueName: \"kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091442 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091485 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091511 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091552 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts\") pod \"96c88c79-2285-4496-af8a-6473f2469577\" (UID: \"96c88c79-2285-4496-af8a-6473f2469577\") " Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.091557 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092342 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092373 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092394 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run" (OuterVolumeSpecName: "var-run") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092671 4882 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092682 4882 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092690 4882 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96c88c79-2285-4496-af8a-6473f2469577-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.092701 4882 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.093926 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts" (OuterVolumeSpecName: "scripts") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.098779 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m" (OuterVolumeSpecName: "kube-api-access-vcs4m") pod "96c88c79-2285-4496-af8a-6473f2469577" (UID: "96c88c79-2285-4496-af8a-6473f2469577"). InnerVolumeSpecName "kube-api-access-vcs4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.171625 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v7dcx-config-4fx6t"] Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.178181 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v7dcx-config-4fx6t"] Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.194330 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcs4m\" (UniqueName: \"kubernetes.io/projected/96c88c79-2285-4496-af8a-6473f2469577-kube-api-access-vcs4m\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.194372 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96c88c79-2285-4496-af8a-6473f2469577-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.272255 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v7dcx-config-qzrjm"] Oct 02 16:36:46 crc kubenswrapper[4882]: E1002 16:36:46.272978 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3616734-9206-483f-a173-6fa0dffe1f82" containerName="swift-ring-rebalance" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273002 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3616734-9206-483f-a173-6fa0dffe1f82" containerName="swift-ring-rebalance" Oct 02 16:36:46 crc kubenswrapper[4882]: E1002 16:36:46.273042 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b9fc00-b554-4426-ba0e-7bcbef667297" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273052 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b9fc00-b554-4426-ba0e-7bcbef667297" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: E1002 16:36:46.273071 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c88c79-2285-4496-af8a-6473f2469577" containerName="ovn-config" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273079 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c88c79-2285-4496-af8a-6473f2469577" containerName="ovn-config" Oct 02 16:36:46 crc kubenswrapper[4882]: E1002 16:36:46.273115 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dfd05c-8ce6-4327-978f-1e936b530b16" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273123 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dfd05c-8ce6-4327-978f-1e936b530b16" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273580 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3616734-9206-483f-a173-6fa0dffe1f82" containerName="swift-ring-rebalance" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273619 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dfd05c-8ce6-4327-978f-1e936b530b16" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273639 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b9fc00-b554-4426-ba0e-7bcbef667297" containerName="mariadb-account-create" Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.273665 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c88c79-2285-4496-af8a-6473f2469577" containerName="ovn-config" Oct 02 16:36:46 
crc kubenswrapper[4882]: I1002 16:36:46.274699 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.299562 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx-config-qzrjm"]
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.299918 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.299989 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.300051 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.300073 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.300146 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghh2\" (UniqueName: \"kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.300179 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401320 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401370 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401408 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghh2\" (UniqueName: \"kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401442 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401475 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401512 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.401890 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.402117 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.402234 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.402599 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.405547 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.418117 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghh2\" (UniqueName: \"kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2\") pod \"ovn-controller-v7dcx-config-qzrjm\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") " pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.502768 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.506832 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"swift-storage-0\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " pod="openstack/swift-storage-0"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.561345 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-v7dcx"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.612914 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.643540 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a3cb022e313b3bc81b4ef909ff390c071452f3f13c148e331b668909d4f161"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.643615 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-4fx6t"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.652820 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 02 16:36:46 crc kubenswrapper[4882]: I1002 16:36:46.772865 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c88c79-2285-4496-af8a-6473f2469577" path="/var/lib/kubelet/pods/96c88c79-2285-4496-af8a-6473f2469577/volumes"
Oct 02 16:36:48 crc kubenswrapper[4882]: I1002 16:36:48.877369 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 16:36:48 crc kubenswrapper[4882]: I1002 16:36:48.947070 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.643524 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h77k7"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.645298 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.655151 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h77k7"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.729041 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-89xpx"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.730515 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-89xpx"
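The block above shows the kubelet's volume reconciler walking every volume of ovn-controller-v7dcx-config-qzrjm (plus swift-storage-0's etc-swift) through its three visible phases in roughly 120 ms: operationExecutor.VerifyControllerAttachedVolume, operationExecutor.MountVolume started, and MountVolume.SetUp succeeded. A minimal sketch for reconstructing that per-volume timeline from a saved journal slice; it assumes the one-record-per-line layout used here and that the messages keep journald's escaped quotes (\") exactly as shown:

```python
#!/usr/bin/env python3
"""Sketch: per-volume mount timelines from kubelet journal lines on stdin."""
import re
import sys
from collections import defaultdict

# klog timestamp embedded in each record, e.g. "I1002 16:36:46.299918"
KLOG_TIME = re.compile(r'[IWE]1002 (\d{2}:\d{2}:\d{2}\.\d+)')
# The three reconciler phases; volume names sit between escaped quotes (\").
PHASES = [
    ('verify', re.compile(r'VerifyControllerAttachedVolume started for volume \\"([^"\\]+)\\"')),
    ('mount',  re.compile(r'operationExecutor\.MountVolume started for volume \\"([^"\\]+)\\"')),
    ('ready',  re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"')),
]

timeline = defaultdict(dict)  # volume name -> phase -> first timestamp seen
for line in sys.stdin:
    t = KLOG_TIME.search(line)
    if not t:
        continue
    for phase, pat in PHASES:
        m = pat.search(line)
        if m:
            timeline[m.group(1)].setdefault(phase, t.group(1))

for volume, phases in sorted(timeline.items()):
    print(volume + ": " + " -> ".join(f"{p}@{phases.get(p, '?')}" for p, _ in PHASES))
```

Run against this window it should print one arrow-chain per volume (additional-scripts, var-log-ovn, var-run-ovn, var-run, kube-api-access-bghh2, scripts, etc-swift).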
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.743411 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-89xpx"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.771602 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mwdk\" (UniqueName: \"kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk\") pod \"cinder-db-create-h77k7\" (UID: \"caf4780a-ca9d-46f8-8295-50e6154f70aa\") " pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.872844 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvskf\" (UniqueName: \"kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf\") pod \"barbican-db-create-89xpx\" (UID: \"4991e502-56d0-483d-b0cb-8c5d2e3d9282\") " pod="openstack/barbican-db-create-89xpx"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.872932 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mwdk\" (UniqueName: \"kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk\") pod \"cinder-db-create-h77k7\" (UID: \"caf4780a-ca9d-46f8-8295-50e6154f70aa\") " pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.890568 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mwdk\" (UniqueName: \"kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk\") pod \"cinder-db-create-h77k7\" (UID: \"caf4780a-ca9d-46f8-8295-50e6154f70aa\") " pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.933766 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-pdk2q"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.942896 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pdk2q"]
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.943014 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.974190 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:50 crc kubenswrapper[4882]: I1002 16:36:50.974607 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvskf\" (UniqueName: \"kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf\") pod \"barbican-db-create-89xpx\" (UID: \"4991e502-56d0-483d-b0cb-8c5d2e3d9282\") " pod="openstack/barbican-db-create-89xpx"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.001325 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8x4zn"]
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.002825 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.007645 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvskf\" (UniqueName: \"kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf\") pod \"barbican-db-create-89xpx\" (UID: \"4991e502-56d0-483d-b0cb-8c5d2e3d9282\") " pod="openstack/barbican-db-create-89xpx"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.012690 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8x4zn"]
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.013308 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.013437 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.013712 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.013949 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8cq5"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.024826 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.048320 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-89xpx"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.076125 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5r2c\" (UniqueName: \"kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c\") pod \"neutron-db-create-pdk2q\" (UID: \"9b09392d-ac00-4006-a716-d489b76594ed\") " pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.177485 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.178072 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hht6\" (UniqueName: \"kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.178716 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.179023 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5r2c\" (UniqueName: \"kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c\") pod \"neutron-db-create-pdk2q\" (UID: \"9b09392d-ac00-4006-a716-d489b76594ed\") " pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.195914 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5r2c\" (UniqueName: \"kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c\") pod \"neutron-db-create-pdk2q\" (UID: \"9b09392d-ac00-4006-a716-d489b76594ed\") " pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.269116 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.280984 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.281106 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hht6\" (UniqueName: \"kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.281142 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.285110 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.285762 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.305568 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hht6\" (UniqueName: \"kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6\") pod \"keystone-db-sync-8x4zn\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " pod="openstack/keystone-db-sync-8x4zn"
Oct 02 16:36:51 crc kubenswrapper[4882]: I1002 16:36:51.354533 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8x4zn"
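Each short-lived job pod in this window (cinder-db-create-h77k7, barbican-db-create-89xpx, neutron-db-create-pdk2q, keystone-db-sync-8x4zn) enters through the same SyncLoop verbs: ADD from the api source, one or more UPDATEs, and, for pods that are later garbage-collected, DELETE and REMOVE. A small sketch under the same input assumptions that orders those verbs per pod:

```python
import re
import sys
from collections import defaultdict

# Matches e.g.: I1002 16:36:50.643524 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h77k7"]
SYNC = re.compile(r'I1002 (\d{2}:\d{2}:\d{2}\.\d+) .*"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]')

events = defaultdict(list)  # pod -> ordered (time, verb)
for line in sys.stdin:
    m = SYNC.search(line)
    if m:
        events[m.group(3)].append((m.group(1), m.group(2)))

for pod, evs in sorted(events.items()):
    print(pod, " ".join(f"{verb}@{t}" for t, verb in evs))
```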
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.245115 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h77k7"]
Oct 02 16:36:55 crc kubenswrapper[4882]: W1002 16:36:55.251555 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1576281_8163_4918_82e6_ff3529fa9702.slice/crio-82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d WatchSource:0}: Error finding container 82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d: Status 404 returned error can't find the container with id 82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.259401 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v7dcx-config-qzrjm"]
Oct 02 16:36:55 crc kubenswrapper[4882]: W1002 16:36:55.260410 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b09392d_ac00_4006_a716_d489b76594ed.slice/crio-47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86 WatchSource:0}: Error finding container 47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86: Status 404 returned error can't find the container with id 47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.265518 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-pdk2q"]
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.384532 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8x4zn"]
Oct 02 16:36:55 crc kubenswrapper[4882]: W1002 16:36:55.391464 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d444aa_bebd_4e21_bff0_c6ad1be5fc18.slice/crio-f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1 WatchSource:0}: Error finding container f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1: Status 404 returned error can't find the container with id f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.465081 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 02 16:36:55 crc kubenswrapper[4882]: W1002 16:36:55.477391 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd26acb_1d48_48f3_b39d_b274bdcd3cce.slice/crio-ce8731e432f4f67124e22949151742a3ebe8662b136718c00dba885eb4640bf1 WatchSource:0}: Error finding container ce8731e432f4f67124e22949151742a3ebe8662b136718c00dba885eb4640bf1: Status 404 returned error can't find the container with id ce8731e432f4f67124e22949151742a3ebe8662b136718c00dba885eb4640bf1
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.495088 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-89xpx"]
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.739521 4882 generic.go:334] "Generic (PLEG): container finished" podID="9b09392d-ac00-4006-a716-d489b76594ed" containerID="c7c2d8087bd15225cd977111ec94e1a88249d610e4918662bc600166c40c96a6" exitCode=0
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.739631 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pdk2q" event={"ID":"9b09392d-ac00-4006-a716-d489b76594ed","Type":"ContainerDied","Data":"c7c2d8087bd15225cd977111ec94e1a88249d610e4918662bc600166c40c96a6"}
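The W-level manager.go:1169 records are the kubelet's embedded cAdvisor racing CRI-O: a cgroup watch event arrives for a crio-<id> slice whose container is not (or no longer) registered, so the lookup returns 404. In this trace each id belongs to a sandbox that starts within the same second, which suggests benign startup churn rather than a fault; that reading is an inference from the surrounding records, not something the log states. A quick counter to confirm no id keeps failing:

```python
import re
import sys
from collections import Counter

# Matches the "Failed to process watch event ... Status 404" warning and
# captures the 64-hex-char container id.
WATCH_404 = re.compile(r'Failed to process watch event .*?Error finding container ([0-9a-f]{64}): Status 404')

ids = Counter(m.group(1) for line in sys.stdin for m in WATCH_404.finditer(line))
for cid, n in ids.most_common():
    print(n, cid[:13])  # repeated high counts would merit a closer look
```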
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.739667 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pdk2q" event={"ID":"9b09392d-ac00-4006-a716-d489b76594ed","Type":"ContainerStarted","Data":"47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.745438 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8kjnt" event={"ID":"6af4887d-ad2c-42e6-a473-88947a33d7cd","Type":"ContainerStarted","Data":"d567389cb25bbabc1546297bf134bd90f58b2f6f44e328db815cbb62df15308f"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.747624 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-89xpx" event={"ID":"4991e502-56d0-483d-b0cb-8c5d2e3d9282","Type":"ContainerStarted","Data":"18d1e1bbb633f45a1ce945f9549eaade1c58f0bd86886f9acff3406ccd8ee5fa"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.749586 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8x4zn" event={"ID":"10d444aa-bebd-4e21-bff0-c6ad1be5fc18","Type":"ContainerStarted","Data":"f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.755123 4882 generic.go:334] "Generic (PLEG): container finished" podID="caf4780a-ca9d-46f8-8295-50e6154f70aa" containerID="e797ff472429769a19e407c394af10ec378417f006ab301847f9d97baf78cf3a" exitCode=0
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.755271 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h77k7" event={"ID":"caf4780a-ca9d-46f8-8295-50e6154f70aa","Type":"ContainerDied","Data":"e797ff472429769a19e407c394af10ec378417f006ab301847f9d97baf78cf3a"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.755308 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h77k7" event={"ID":"caf4780a-ca9d-46f8-8295-50e6154f70aa","Type":"ContainerStarted","Data":"2315c4dd09645df69da5d138e1f81d6d332681646a4f1c54c42ae7a212675f1c"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.792795 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8kjnt" podStartSLOduration=2.516774189 podStartE2EDuration="18.792771579s" podCreationTimestamp="2025-10-02 16:36:37 +0000 UTC" firstStartedPulling="2025-10-02 16:36:38.451648515 +0000 UTC m=+1157.200878042" lastFinishedPulling="2025-10-02 16:36:54.727645895 +0000 UTC m=+1173.476875432" observedRunningTime="2025-10-02 16:36:55.783874135 +0000 UTC m=+1174.533103672" watchObservedRunningTime="2025-10-02 16:36:55.792771579 +0000 UTC m=+1174.542001106"
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.795524 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-qzrjm" event={"ID":"b1576281-8163-4918-82e6-ff3529fa9702","Type":"ContainerStarted","Data":"82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d"}
Oct 02 16:36:55 crc kubenswrapper[4882]: I1002 16:36:55.802826 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"ce8731e432f4f67124e22949151742a3ebe8662b136718c00dba885eb4640bf1"}
Oct 02 16:36:56 crc kubenswrapper[4882]: I1002 16:36:56.813713 4882 generic.go:334] "Generic (PLEG): container finished" podID="4991e502-56d0-483d-b0cb-8c5d2e3d9282" containerID="fca6ccaa26b7ac77a6a8aeaaa1443eac120b8470f0a5167ae0835b65ccac8088" exitCode=0
Oct 02 16:36:56 crc kubenswrapper[4882]: I1002 16:36:56.813915 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-89xpx" event={"ID":"4991e502-56d0-483d-b0cb-8c5d2e3d9282","Type":"ContainerDied","Data":"fca6ccaa26b7ac77a6a8aeaaa1443eac120b8470f0a5167ae0835b65ccac8088"}
Oct 02 16:36:56 crc kubenswrapper[4882]: I1002 16:36:56.817708 4882 generic.go:334] "Generic (PLEG): container finished" podID="b1576281-8163-4918-82e6-ff3529fa9702" containerID="a698dea8c1459df07df2dd2e5c83bea57c6d7a823b74aae6cc9b205345f7a466" exitCode=0
Oct 02 16:36:56 crc kubenswrapper[4882]: I1002 16:36:56.817778 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-qzrjm" event={"ID":"b1576281-8163-4918-82e6-ff3529fa9702","Type":"ContainerDied","Data":"a698dea8c1459df07df2dd2e5c83bea57c6d7a823b74aae6cc9b205345f7a466"}
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.180337 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.187565 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.298895 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mwdk\" (UniqueName: \"kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk\") pod \"caf4780a-ca9d-46f8-8295-50e6154f70aa\" (UID: \"caf4780a-ca9d-46f8-8295-50e6154f70aa\") "
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.300026 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5r2c\" (UniqueName: \"kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c\") pod \"9b09392d-ac00-4006-a716-d489b76594ed\" (UID: \"9b09392d-ac00-4006-a716-d489b76594ed\") "
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.305833 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk" (OuterVolumeSpecName: "kube-api-access-5mwdk") pod "caf4780a-ca9d-46f8-8295-50e6154f70aa" (UID: "caf4780a-ca9d-46f8-8295-50e6154f70aa"). InnerVolumeSpecName "kube-api-access-5mwdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.306359 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c" (OuterVolumeSpecName: "kube-api-access-p5r2c") pod "9b09392d-ac00-4006-a716-d489b76594ed" (UID: "9b09392d-ac00-4006-a716-d489b76594ed"). InnerVolumeSpecName "kube-api-access-p5r2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
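The PLEG pairs above carry the job results: generic.go:334 reports a finished container with its exitCode, and the adjacent "SyncLoop (PLEG): event for pod" ContainerDied record carries the pod name, whose event ID equals the podID. Joining the two gives a per-pod exit-code summary; a sketch under the same input assumptions (every db-create container in this window exited 0):

```python
import re
import sys

# "Generic (PLEG): container finished" podID="..." containerID="..." exitCode=0
FINISHED = re.compile(r'"Generic \(PLEG\): container finished" podID="([^"]+)" containerID="([0-9a-f]+)" exitCode=(-?\d+)')
# Any PLEG event line maps a pod UID (event ID) to its namespaced name.
EVENT = re.compile(r'pod="([^"]+)" event={"ID":"([^"]+)"')

names, results = {}, []
for line in sys.stdin:
    for m in EVENT.finditer(line):
        names[m.group(2)] = m.group(1)
    for m in FINISHED.finditer(line):
        results.append((m.group(1), m.group(2), int(m.group(3))))

for uid, cid, code in results:
    print(f"{names.get(uid, uid)} {cid[:13]} exitCode={code}")
```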
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.402165 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mwdk\" (UniqueName: \"kubernetes.io/projected/caf4780a-ca9d-46f8-8295-50e6154f70aa-kube-api-access-5mwdk\") on node \"crc\" DevicePath \"\""
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.402204 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5r2c\" (UniqueName: \"kubernetes.io/projected/9b09392d-ac00-4006-a716-d489b76594ed-kube-api-access-p5r2c\") on node \"crc\" DevicePath \"\""
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.830180 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-pdk2q" event={"ID":"9b09392d-ac00-4006-a716-d489b76594ed","Type":"ContainerDied","Data":"47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86"}
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.830243 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-pdk2q"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.830251 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b8588c5d2f8b44c9d8dac1c27ec1981d9ab0242a2a3fcbacee7bb6097a9f86"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.832066 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h77k7" event={"ID":"caf4780a-ca9d-46f8-8295-50e6154f70aa","Type":"ContainerDied","Data":"2315c4dd09645df69da5d138e1f81d6d332681646a4f1c54c42ae7a212675f1c"}
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.832116 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2315c4dd09645df69da5d138e1f81d6d332681646a4f1c54c42ae7a212675f1c"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.832119 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h77k7"
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.833754 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"f6dd5331866217f7c3fd5cb62ed15451551f9d0a3146c188905b3c00fb8cb139"}
Oct 02 16:36:57 crc kubenswrapper[4882]: I1002 16:36:57.833778 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe"}
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.780116 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.787438 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-89xpx"
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.919958 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920036 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920107 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920158 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920197 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bghh2\" (UniqueName: \"kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920249 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run\") pod \"b1576281-8163-4918-82e6-ff3529fa9702\" (UID: \"b1576281-8163-4918-82e6-ff3529fa9702\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920291 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvskf\" (UniqueName: \"kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf\") pod \"4991e502-56d0-483d-b0cb-8c5d2e3d9282\" (UID: \"4991e502-56d0-483d-b0cb-8c5d2e3d9282\") "
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920359 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.920736 4882 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.921203 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.921555 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts" (OuterVolumeSpecName: "scripts") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.921714 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run" (OuterVolumeSpecName: "var-run") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.924360 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.947510 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2" (OuterVolumeSpecName: "kube-api-access-bghh2") pod "b1576281-8163-4918-82e6-ff3529fa9702" (UID: "b1576281-8163-4918-82e6-ff3529fa9702"). InnerVolumeSpecName "kube-api-access-bghh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.959451 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf" (OuterVolumeSpecName: "kube-api-access-hvskf") pod "4991e502-56d0-483d-b0cb-8c5d2e3d9282" (UID: "4991e502-56d0-483d-b0cb-8c5d2e3d9282"). InnerVolumeSpecName "kube-api-access-hvskf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.961854 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx-config-qzrjm" event={"ID":"b1576281-8163-4918-82e6-ff3529fa9702","Type":"ContainerDied","Data":"82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d"}
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.961909 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82d4a30b3d3a9be21a28faca9026a80cb62e6367fdc2aeec92c3ba7f7302619d"
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.962006 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx-config-qzrjm"
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.989707 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-89xpx" event={"ID":"4991e502-56d0-483d-b0cb-8c5d2e3d9282","Type":"ContainerDied","Data":"18d1e1bbb633f45a1ce945f9549eaade1c58f0bd86886f9acff3406ccd8ee5fa"}
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.989922 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d1e1bbb633f45a1ce945f9549eaade1c58f0bd86886f9acff3406ccd8ee5fa"
Oct 02 16:37:02 crc kubenswrapper[4882]: I1002 16:37:02.990238 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-89xpx"
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023567 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bghh2\" (UniqueName: \"kubernetes.io/projected/b1576281-8163-4918-82e6-ff3529fa9702-kube-api-access-bghh2\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023636 4882 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-run\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023646 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvskf\" (UniqueName: \"kubernetes.io/projected/4991e502-56d0-483d-b0cb-8c5d2e3d9282-kube-api-access-hvskf\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023663 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023672 4882 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1576281-8163-4918-82e6-ff3529fa9702-var-log-ovn\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:03 crc kubenswrapper[4882]: I1002 16:37:03.023681 4882 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1576281-8163-4918-82e6-ff3529fa9702-additional-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:04 crc kubenswrapper[4882]: I1002 16:37:04.015659 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"92d1d3fc20a44da92843f516fd7e287b2820813b7b752be600e88b83707ea9d9"}
Oct 02 16:37:04 crc kubenswrapper[4882]: I1002 16:37:04.016373 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"19db6cdfa5b1c6ddeac4c3af0c7dac82506480145ee4d50af2f88dd3e7515251"}
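Teardown mirrors setup: operationExecutor.UnmountVolume started (reconciler_common.go:159), UnmountVolume.TearDown succeeded (operation_generator.go:803), then "Volume detached" (reconciler_common.go:293). A balance check that every unmount which starts also reaches detached; note the started/detached messages keep journald's escaped \" style while the TearDown line uses plain quotes, so only the former two are matched in this sketch:

```python
import re
import sys

# Capture volume name and UniqueName from the escaped-quote messages.
STARTED  = re.compile(r'UnmountVolume started for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"')
DETACHED = re.compile(r'Volume detached for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"')

started, detached = {}, set()
for line in sys.stdin:
    for m in STARTED.finditer(line):
        started[m.group(2)] = m.group(1)   # UniqueName -> volume name
    for m in DETACHED.finditer(line):
        detached.add(m.group(2))

for unique, name in sorted(started.items()):
    status = "detached" if unique in detached else "STILL PENDING"
    print(f"{name} ({unique}): {status}")
```

Over this window every started unmount (5mwdk, p5r2c, and all six ovn-config volumes plus hvskf) should report detached.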
Oct 02 16:37:04 crc kubenswrapper[4882]: I1002 16:37:04.055757 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v7dcx-config-qzrjm"]
Oct 02 16:37:04 crc kubenswrapper[4882]: I1002 16:37:04.061992 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v7dcx-config-qzrjm"]
Oct 02 16:37:04 crc kubenswrapper[4882]: I1002 16:37:04.771888 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1576281-8163-4918-82e6-ff3529fa9702" path="/var/lib/kubelet/pods/b1576281-8163-4918-82e6-ff3529fa9702/volumes"
Oct 02 16:37:05 crc kubenswrapper[4882]: I1002 16:37:05.030048 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8x4zn" event={"ID":"10d444aa-bebd-4e21-bff0-c6ad1be5fc18","Type":"ContainerStarted","Data":"e87d46b8c4b9c4e1d029bb60ac23e02cb1025402fb1c0203620330877f0c8b8a"}
Oct 02 16:37:05 crc kubenswrapper[4882]: I1002 16:37:05.051499 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8x4zn" podStartSLOduration=6.663744704 podStartE2EDuration="15.051471995s" podCreationTimestamp="2025-10-02 16:36:50 +0000 UTC" firstStartedPulling="2025-10-02 16:36:55.395118903 +0000 UTC m=+1174.144348430" lastFinishedPulling="2025-10-02 16:37:03.782846194 +0000 UTC m=+1182.532075721" observedRunningTime="2025-10-02 16:37:05.04850527 +0000 UTC m=+1183.797734797" watchObservedRunningTime="2025-10-02 16:37:05.051471995 +0000 UTC m=+1183.800701522"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:09.390365 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:09.390799 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:09.390860 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:09.391543 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:09.391614 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d" gracePeriod=600
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.684191 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a969-account-create-xfhzm"]
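From here journald stamps the records 16:37:12 while the embedded klog timestamps read 16:37:09 through 16:37:11, consistent with journal delivery catching up after a brief stall; in the same window the machine-config-daemon liveness probe failed with connection refused, so the kubelet killed the container with gracePeriod=600 and restarted it (the matching ContainerDied/ContainerStarted pair appears below). A sketch correlating each liveness failure with the subsequent kill:

```python
import re
import sys

# First liveness failure per pod, and the kill that follows it.
FAILED = re.compile(r'I1002 (\S+) .*"Probe failed" probeType="Liveness" pod="([^"]+)"')
KILLED = re.compile(r'I1002 (\S+) .*"Killing container with a grace period" pod="([^"]+)".*gracePeriod=(\d+)')

failures = {}
for line in sys.stdin:
    for m in FAILED.finditer(line):
        failures.setdefault(m.group(2), m.group(1))
    for m in KILLED.finditer(line):
        t_fail = failures.get(m.group(2), "?")
        print(f"{m.group(2)}: liveness failed @{t_fail}, killed @{m.group(1)} (grace {m.group(3)}s)")
```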
Oct 02 16:37:12 crc kubenswrapper[4882]: E1002 16:37:10.685073 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4991e502-56d0-483d-b0cb-8c5d2e3d9282" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685088 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4991e502-56d0-483d-b0cb-8c5d2e3d9282" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: E1002 16:37:10.685107 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf4780a-ca9d-46f8-8295-50e6154f70aa" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685113 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf4780a-ca9d-46f8-8295-50e6154f70aa" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: E1002 16:37:10.685131 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b09392d-ac00-4006-a716-d489b76594ed" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685138 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b09392d-ac00-4006-a716-d489b76594ed" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: E1002 16:37:10.685153 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1576281-8163-4918-82e6-ff3529fa9702" containerName="ovn-config"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685159 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1576281-8163-4918-82e6-ff3529fa9702" containerName="ovn-config"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685391 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4991e502-56d0-483d-b0cb-8c5d2e3d9282" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685424 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf4780a-ca9d-46f8-8295-50e6154f70aa" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685444 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b09392d-ac00-4006-a716-d489b76594ed" containerName="mariadb-database-create"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.685465 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1576281-8163-4918-82e6-ff3529fa9702" containerName="ovn-config"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.686137 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.688525 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.692533 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a969-account-create-xfhzm"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.772032 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6e73-account-create-qc4g4"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.774155 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.783021 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.788231 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6e73-account-create-qc4g4"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.803282 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6qt\" (UniqueName: \"kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt\") pod \"cinder-a969-account-create-xfhzm\" (UID: \"bfbbf5b9-098a-4705-921b-8bff1b195af8\") " pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.904962 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6qt\" (UniqueName: \"kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt\") pod \"cinder-a969-account-create-xfhzm\" (UID: \"bfbbf5b9-098a-4705-921b-8bff1b195af8\") " pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.905078 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt8kk\" (UniqueName: \"kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk\") pod \"barbican-6e73-account-create-qc4g4\" (UID: \"f2e66594-8ce0-4953-a294-039c0d04d069\") " pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:10.924027 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6qt\" (UniqueName: \"kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt\") pod \"cinder-a969-account-create-xfhzm\" (UID: \"bfbbf5b9-098a-4705-921b-8bff1b195af8\") " pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.006766 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt8kk\" (UniqueName: \"kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk\") pod \"barbican-6e73-account-create-qc4g4\" (UID: \"f2e66594-8ce0-4953-a294-039c0d04d069\") " pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.016148 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.027031 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt8kk\" (UniqueName: \"kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk\") pod \"barbican-6e73-account-create-qc4g4\" (UID: \"f2e66594-8ce0-4953-a294-039c0d04d069\") " pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.064489 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0481-account-create-crqcr"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.065574 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.068510 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.072677 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0481-account-create-crqcr"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.108655 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hzv\" (UniqueName: \"kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv\") pod \"neutron-0481-account-create-crqcr\" (UID: \"71290daa-6e18-4227-9a32-b73f3c816d89\") " pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.116205 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.210197 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hzv\" (UniqueName: \"kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv\") pod \"neutron-0481-account-create-crqcr\" (UID: \"71290daa-6e18-4227-9a32-b73f3c816d89\") " pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.231184 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hzv\" (UniqueName: \"kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv\") pod \"neutron-0481-account-create-crqcr\" (UID: \"71290daa-6e18-4227-9a32-b73f3c816d89\") " pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:11.398106 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:12.884710 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6e73-account-create-qc4g4"]
Oct 02 16:37:12 crc kubenswrapper[4882]: W1002 16:37:12.891609 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e66594_8ce0_4953_a294_039c0d04d069.slice/crio-eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c WatchSource:0}: Error finding container eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c: Status 404 returned error can't find the container with id eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:12.936315 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a969-account-create-xfhzm"]
Oct 02 16:37:12 crc kubenswrapper[4882]: I1002 16:37:12.946381 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0481-account-create-crqcr"]
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.108345 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"7c21495965895e4c5e03f6e2ccf050c5159064c4ed28968f122e21de15ea7bf0"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.108393 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"0df9344da4286c51066abdcf7f4044bd806c337ad8fb5c3fd6ad431458d2179a"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.110357 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a969-account-create-xfhzm" event={"ID":"bfbbf5b9-098a-4705-921b-8bff1b195af8","Type":"ContainerStarted","Data":"5654061d8a9e3060d68cda56a3531f06620ce37c173e7f131624d9c0952066b8"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.123232 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d" exitCode=0
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.123365 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.123403 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.123423 4882 scope.go:117] "RemoveContainer" containerID="f44cd146d47205c1f6441437b6ff7350cb43493b056fc71a20f480df78729e48"
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.139057 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e73-account-create-qc4g4" event={"ID":"f2e66594-8ce0-4953-a294-039c0d04d069","Type":"ContainerStarted","Data":"861f9f1e4489ef479933426da2698204038748e21a662991a1a2c346c552784f"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.139132 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e73-account-create-qc4g4" event={"ID":"f2e66594-8ce0-4953-a294-039c0d04d069","Type":"ContainerStarted","Data":"eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c"}
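The E-level cpu_manager.go:410 records and their I-level state_mem/memory_manager companions fire as the resource-manager state of the just-deleted db-create and ovn-config pods is reclaimed; despite the error severity they read as routine cleanup for containers that no longer exist, though that is an interpretation of the surrounding context rather than something the log asserts. A sketch grouping the reclaimed containers per pod UID (the memory_manager lines use the slightly different "RemoveStaleState removing state" wording and would need a second pattern):

```python
import re
import sys
from collections import defaultdict

# cpu_manager form only: "RemoveStaleState: removing container"
STALE = re.compile(r'"RemoveStaleState: removing container" podUID="([^"]+)" containerName="([^"]+)"')

pods = defaultdict(set)
for line in sys.stdin:
    for m in STALE.finditer(line):
        pods[m.group(1)].add(m.group(2))

for uid, containers in sorted(pods.items()):
    print(uid, ",".join(sorted(containers)))
```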
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.143092 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0481-account-create-crqcr" event={"ID":"71290daa-6e18-4227-9a32-b73f3c816d89","Type":"ContainerStarted","Data":"f1ebf614eae07daee58a1c19f5106d49c4a1d6eb506c85af72058d5dbd063e22"}
Oct 02 16:37:13 crc kubenswrapper[4882]: I1002 16:37:13.167368 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-6e73-account-create-qc4g4" podStartSLOduration=3.1673368809999998 podStartE2EDuration="3.167336881s" podCreationTimestamp="2025-10-02 16:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:13.165807662 +0000 UTC m=+1191.915037209" watchObservedRunningTime="2025-10-02 16:37:13.167336881 +0000 UTC m=+1191.916566408"
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.158943 4882 generic.go:334] "Generic (PLEG): container finished" podID="bfbbf5b9-098a-4705-921b-8bff1b195af8" containerID="a671a16f869006d4662756534716ca55280fdb96d3705b7a0f97ecb4b4a9fd45" exitCode=0
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.159051 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a969-account-create-xfhzm" event={"ID":"bfbbf5b9-098a-4705-921b-8bff1b195af8","Type":"ContainerDied","Data":"a671a16f869006d4662756534716ca55280fdb96d3705b7a0f97ecb4b4a9fd45"}
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.168026 4882 generic.go:334] "Generic (PLEG): container finished" podID="f2e66594-8ce0-4953-a294-039c0d04d069" containerID="861f9f1e4489ef479933426da2698204038748e21a662991a1a2c346c552784f" exitCode=0
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.168132 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e73-account-create-qc4g4" event={"ID":"f2e66594-8ce0-4953-a294-039c0d04d069","Type":"ContainerDied","Data":"861f9f1e4489ef479933426da2698204038748e21a662991a1a2c346c552784f"}
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.171304 4882 generic.go:334] "Generic (PLEG): container finished" podID="71290daa-6e18-4227-9a32-b73f3c816d89" containerID="1583bc602eccf3526a8f4e6c9a0718e5060e03369a08c1eca224a21e74cae345" exitCode=0
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.171528 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0481-account-create-crqcr" event={"ID":"71290daa-6e18-4227-9a32-b73f3c816d89","Type":"ContainerDied","Data":"1583bc602eccf3526a8f4e6c9a0718e5060e03369a08c1eca224a21e74cae345"}
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.184826 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"98c3f10e6bd125fee0ba2ce83eeb60ba8e4bb2141fa7130276214ba1b0155863"}
Oct 02 16:37:14 crc kubenswrapper[4882]: I1002 16:37:14.185197 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"e68f25a45eb857be77e95e76cbef672ca88b8dfae0b6833226a9a98f31867462"}
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.852809 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.859838 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.877236 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0481-account-create-crqcr"
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.926677 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt8kk\" (UniqueName: \"kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk\") pod \"f2e66594-8ce0-4953-a294-039c0d04d069\" (UID: \"f2e66594-8ce0-4953-a294-039c0d04d069\") "
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.926865 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw6qt\" (UniqueName: \"kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt\") pod \"bfbbf5b9-098a-4705-921b-8bff1b195af8\" (UID: \"bfbbf5b9-098a-4705-921b-8bff1b195af8\") "
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.926913 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8hzv\" (UniqueName: \"kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv\") pod \"71290daa-6e18-4227-9a32-b73f3c816d89\" (UID: \"71290daa-6e18-4227-9a32-b73f3c816d89\") "
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.936483 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv" (OuterVolumeSpecName: "kube-api-access-d8hzv") pod "71290daa-6e18-4227-9a32-b73f3c816d89" (UID: "71290daa-6e18-4227-9a32-b73f3c816d89"). InnerVolumeSpecName "kube-api-access-d8hzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.936480 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt" (OuterVolumeSpecName: "kube-api-access-cw6qt") pod "bfbbf5b9-098a-4705-921b-8bff1b195af8" (UID: "bfbbf5b9-098a-4705-921b-8bff1b195af8"). InnerVolumeSpecName "kube-api-access-cw6qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:37:15 crc kubenswrapper[4882]: I1002 16:37:15.941339 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk" (OuterVolumeSpecName: "kube-api-access-pt8kk") pod "f2e66594-8ce0-4953-a294-039c0d04d069" (UID: "f2e66594-8ce0-4953-a294-039c0d04d069"). InnerVolumeSpecName "kube-api-access-pt8kk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.029115 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw6qt\" (UniqueName: \"kubernetes.io/projected/bfbbf5b9-098a-4705-921b-8bff1b195af8-kube-api-access-cw6qt\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.029174 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8hzv\" (UniqueName: \"kubernetes.io/projected/71290daa-6e18-4227-9a32-b73f3c816d89-kube-api-access-d8hzv\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.029185 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt8kk\" (UniqueName: \"kubernetes.io/projected/f2e66594-8ce0-4953-a294-039c0d04d069-kube-api-access-pt8kk\") on node \"crc\" DevicePath \"\""
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.206559 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a969-account-create-xfhzm" event={"ID":"bfbbf5b9-098a-4705-921b-8bff1b195af8","Type":"ContainerDied","Data":"5654061d8a9e3060d68cda56a3531f06620ce37c173e7f131624d9c0952066b8"}
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.206616 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5654061d8a9e3060d68cda56a3531f06620ce37c173e7f131624d9c0952066b8"
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.206578 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a969-account-create-xfhzm"
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.208547 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6e73-account-create-qc4g4" event={"ID":"f2e66594-8ce0-4953-a294-039c0d04d069","Type":"ContainerDied","Data":"eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c"}
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.208591 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf1639b8f34524eb407ab20260333d47369b545449a2787629686d4139d5f5c"
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.208547 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6e73-account-create-qc4g4"
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.211658 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0481-account-create-crqcr" event={"ID":"71290daa-6e18-4227-9a32-b73f3c816d89","Type":"ContainerDied","Data":"f1ebf614eae07daee58a1c19f5106d49c4a1d6eb506c85af72058d5dbd063e22"}
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.211710 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ebf614eae07daee58a1c19f5106d49c4a1d6eb506c85af72058d5dbd063e22"
Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.211672 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0481-account-create-crqcr"
Need to start a new one" pod="openstack/neutron-0481-account-create-crqcr" Oct 02 16:37:16 crc kubenswrapper[4882]: I1002 16:37:16.223613 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268"} Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.241949 4882 generic.go:334] "Generic (PLEG): container finished" podID="10d444aa-bebd-4e21-bff0-c6ad1be5fc18" containerID="e87d46b8c4b9c4e1d029bb60ac23e02cb1025402fb1c0203620330877f0c8b8a" exitCode=0 Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.242094 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8x4zn" event={"ID":"10d444aa-bebd-4e21-bff0-c6ad1be5fc18","Type":"ContainerDied","Data":"e87d46b8c4b9c4e1d029bb60ac23e02cb1025402fb1c0203620330877f0c8b8a"} Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.250811 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"7f14fdb5932d30e2872b1a5885f0b55773e7edd9a6545624f4e0a5cc89fc9950"} Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.250863 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"a02d6df731a02fe06af482e56bb798e1405d30e9a7582568ae70058f44d8649b"} Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.250877 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"dd782c2cb9bfe49afd713e8f98727186b4594352b4f94c3334399dcf0f3ddcde"} Oct 02 16:37:17 crc kubenswrapper[4882]: I1002 16:37:17.250891 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"c324410a34d486d3741d64c1759808560c80771c9771319557a7fa0a96becbf8"} Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.271373 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"beb784c14f77f12a3964b2ef2082da3578704e6f5267fe025604c315cd46e6ba"} Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.271757 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerStarted","Data":"01c2f3830e5d56cd56ba9cb9e3532e635f8aa8907530ce2426f74c73661d8d82"} Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.315813 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.975712663 podStartE2EDuration="1m5.315787846s" podCreationTimestamp="2025-10-02 16:36:13 +0000 UTC" firstStartedPulling="2025-10-02 16:36:55.480333788 +0000 UTC m=+1174.229563305" lastFinishedPulling="2025-10-02 16:37:15.820408961 +0000 UTC m=+1194.569638488" observedRunningTime="2025-10-02 16:37:18.312810841 +0000 UTC m=+1197.062040398" watchObservedRunningTime="2025-10-02 16:37:18.315787846 +0000 UTC m=+1197.065017383" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.562696 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8x4zn" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.576159 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data\") pod \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.576272 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle\") pod \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.576307 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hht6\" (UniqueName: \"kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6\") pod \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\" (UID: \"10d444aa-bebd-4e21-bff0-c6ad1be5fc18\") " Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.584062 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6" (OuterVolumeSpecName: "kube-api-access-9hht6") pod "10d444aa-bebd-4e21-bff0-c6ad1be5fc18" (UID: "10d444aa-bebd-4e21-bff0-c6ad1be5fc18"). InnerVolumeSpecName "kube-api-access-9hht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.607623 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:18 crc kubenswrapper[4882]: E1002 16:37:18.607940 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d444aa-bebd-4e21-bff0-c6ad1be5fc18" containerName="keystone-db-sync" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.607956 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d444aa-bebd-4e21-bff0-c6ad1be5fc18" containerName="keystone-db-sync" Oct 02 16:37:18 crc kubenswrapper[4882]: E1002 16:37:18.607968 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbbf5b9-098a-4705-921b-8bff1b195af8" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.607975 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbbf5b9-098a-4705-921b-8bff1b195af8" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: E1002 16:37:18.607992 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e66594-8ce0-4953-a294-039c0d04d069" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608000 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e66594-8ce0-4953-a294-039c0d04d069" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: E1002 16:37:18.608016 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71290daa-6e18-4227-9a32-b73f3c816d89" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608021 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="71290daa-6e18-4227-9a32-b73f3c816d89" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608178 4882 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="71290daa-6e18-4227-9a32-b73f3c816d89" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608192 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbbf5b9-098a-4705-921b-8bff1b195af8" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608227 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e66594-8ce0-4953-a294-039c0d04d069" containerName="mariadb-account-create" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.608236 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d444aa-bebd-4e21-bff0-c6ad1be5fc18" containerName="keystone-db-sync" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.609029 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.614270 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.627941 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d444aa-bebd-4e21-bff0-c6ad1be5fc18" (UID: "10d444aa-bebd-4e21-bff0-c6ad1be5fc18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.630267 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.643100 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data" (OuterVolumeSpecName: "config-data") pod "10d444aa-bebd-4e21-bff0-c6ad1be5fc18" (UID: "10d444aa-bebd-4e21-bff0-c6ad1be5fc18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677240 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677294 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677342 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677362 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vngck\" (UniqueName: \"kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677390 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677443 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677572 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677612 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.677627 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hht6\" (UniqueName: \"kubernetes.io/projected/10d444aa-bebd-4e21-bff0-c6ad1be5fc18-kube-api-access-9hht6\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780021 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780096 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vngck\" (UniqueName: \"kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780153 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780323 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780435 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.780503 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.781273 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.781277 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.781623 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.781811 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config\") pod 
\"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.781878 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.799782 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vngck\" (UniqueName: \"kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck\") pod \"dnsmasq-dns-5788dcf4dc-fwb7d\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:18 crc kubenswrapper[4882]: I1002 16:37:18.995166 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.282781 4882 generic.go:334] "Generic (PLEG): container finished" podID="6af4887d-ad2c-42e6-a473-88947a33d7cd" containerID="d567389cb25bbabc1546297bf134bd90f58b2f6f44e328db815cbb62df15308f" exitCode=0 Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.282986 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8kjnt" event={"ID":"6af4887d-ad2c-42e6-a473-88947a33d7cd","Type":"ContainerDied","Data":"d567389cb25bbabc1546297bf134bd90f58b2f6f44e328db815cbb62df15308f"} Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.286063 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8x4zn" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.286578 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8x4zn" event={"ID":"10d444aa-bebd-4e21-bff0-c6ad1be5fc18","Type":"ContainerDied","Data":"f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1"} Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.286608 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13ca12a1f433b90bea7a49b312b006ffde89cb9e5508cc7d1070a538010ade1" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.400698 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-89997"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.401823 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.405029 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8cq5" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.405226 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.405274 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.405333 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.417517 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-89997"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.449013 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.492767 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.522235 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.528256 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.542930 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596134 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596198 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596340 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2k2\" (UniqueName: \"kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596364 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596391 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.596430 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.615079 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.617645 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.619270 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.621203 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.631061 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697298 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697492 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697591 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697735 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697821 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.697926 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.698005 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2k2\" (UniqueName: \"kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.698088 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.698178 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcm8h\" (UniqueName: \"kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.698300 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.698400 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.700097 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.710465 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.710467 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.711960 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts\") pod 
\"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.712890 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.723985 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2k2\" (UniqueName: \"kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.730718 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys\") pod \"keystone-bootstrap-89997\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.739066 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805086 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805163 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805199 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805231 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805267 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpdkv\" (UniqueName: \"kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805289 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805322 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805358 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805381 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805400 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcm8h\" (UniqueName: \"kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805422 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805446 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.805463 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.806903 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.807626 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " 
pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.807791 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.807820 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.807920 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.816579 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf"] Oct 02 16:37:19 crc kubenswrapper[4882]: E1002 16:37:19.817229 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rcm8h], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" podUID="73cb7217-803e-4ba7-945e-597bc6702c4f" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.836501 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcm8h\" (UniqueName: \"kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h\") pod \"dnsmasq-dns-7fc4b7d6c5-qcqgf\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.850017 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.851378 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.863422 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.881556 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fcqvb"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.883903 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.887502 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5v96v" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.887737 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.887856 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.906775 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.906877 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.906909 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.906940 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.906972 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.907044 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.907064 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpdkv\" (UniqueName: \"kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.907360 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.909445 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.916678 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.916842 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.928926 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.929312 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fcqvb"] Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.930691 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:19 crc kubenswrapper[4882]: I1002 16:37:19.934157 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpdkv\" (UniqueName: \"kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv\") pod \"ceilometer-0\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") " pod="openstack/ceilometer-0" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7bh\" (UniqueName: \"kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008202 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008251 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008377 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008410 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008426 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64fp\" (UniqueName: \"kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008452 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008470 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008495 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008514 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.008548 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.024686 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111690 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111756 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111782 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w64fp\" (UniqueName: \"kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111815 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111835 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111869 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111900 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111942 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.111978 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7bh\" (UniqueName: \"kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.112007 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.112037 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.113466 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.114394 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.114460 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.114574 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.117073 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.119698 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.119935 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.122082 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.122969 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.137905 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7bh\" (UniqueName: \"kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh\") pod \"dnsmasq-dns-6b97d8f565-7ht8b\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.140138 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64fp\" (UniqueName: \"kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp\") pod \"placement-db-sync-fcqvb\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.175782 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.267599 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fcqvb" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.312657 4882 generic.go:334] "Generic (PLEG): container finished" podID="93e0ec04-22b6-48aa-bc81-435b400cb733" containerID="9b7ca77af7524648f7c7c3ad2a42db3bc652ac071e57203ec41563715cb50ab8" exitCode=0 Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.312753 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.313292 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" event={"ID":"93e0ec04-22b6-48aa-bc81-435b400cb733","Type":"ContainerDied","Data":"9b7ca77af7524648f7c7c3ad2a42db3bc652ac071e57203ec41563715cb50ab8"} Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.313349 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" event={"ID":"93e0ec04-22b6-48aa-bc81-435b400cb733","Type":"ContainerStarted","Data":"8b6d48ddd039b5e384f529edb0677a126786ec9e579521660682afe3a4e858f5"} Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.341974 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-89997"] Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.371894 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:20 crc kubenswrapper[4882]: W1002 16:37:20.444530 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5002216b_8942_42c3_abbe_9c426fa17da6.slice/crio-d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4 WatchSource:0}: Error finding container d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4: Status 404 returned error can't find the container with id d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4 Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518337 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518683 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518734 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcm8h\" (UniqueName: \"kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518766 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518795 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.518895 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0\") pod \"73cb7217-803e-4ba7-945e-597bc6702c4f\" (UID: \"73cb7217-803e-4ba7-945e-597bc6702c4f\") " Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.519150 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.519477 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.519550 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.519771 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config" (OuterVolumeSpecName: "config") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.520135 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.520538 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.524905 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h" (OuterVolumeSpecName: "kube-api-access-rcm8h") pod "73cb7217-803e-4ba7-945e-597bc6702c4f" (UID: "73cb7217-803e-4ba7-945e-597bc6702c4f"). InnerVolumeSpecName "kube-api-access-rcm8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.561890 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.620489 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.620518 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.620528 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcm8h\" (UniqueName: \"kubernetes.io/projected/73cb7217-803e-4ba7-945e-597bc6702c4f-kube-api-access-rcm8h\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.620541 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.620549 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73cb7217-803e-4ba7-945e-597bc6702c4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.884428 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-spmcq"] Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.885990 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.892427 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pr5rh" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.892457 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.893326 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.912119 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-spmcq"] Oct 02 16:37:20 crc kubenswrapper[4882]: I1002 16:37:20.981029 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038235 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038296 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038316 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrkd\" (UniqueName: \"kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038344 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038409 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.038425 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.068837 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fcqvb"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139338 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config\") pod \"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139421 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0\") pod \"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139578 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc\") pod 
\"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139624 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb\") pod \"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139668 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb\") pod \"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.139721 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vngck\" (UniqueName: \"kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck\") pod \"93e0ec04-22b6-48aa-bc81-435b400cb733\" (UID: \"93e0ec04-22b6-48aa-bc81-435b400cb733\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140513 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140555 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140616 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140646 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140670 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrkd\" (UniqueName: \"kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.140700 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.143608 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.157629 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.166664 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.168370 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.171603 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck" (OuterVolumeSpecName: "kube-api-access-vngck") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "kube-api-access-vngck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.175245 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrkd\" (UniqueName: \"kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.177771 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data\") pod \"cinder-db-sync-spmcq\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.193489 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.208469 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.215085 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8kjnt" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.217235 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.217770 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.221922 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config" (OuterVolumeSpecName: "config") pod "93e0ec04-22b6-48aa-bc81-435b400cb733" (UID: "93e0ec04-22b6-48aa-bc81-435b400cb733"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.242527 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.242746 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vngck\" (UniqueName: \"kubernetes.io/projected/93e0ec04-22b6-48aa-bc81-435b400cb733-kube-api-access-vngck\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.242814 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.242924 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.243050 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.243118 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93e0ec04-22b6-48aa-bc81-435b400cb733-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.293593 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k42n2"] Oct 02 16:37:21 crc kubenswrapper[4882]: E1002 16:37:21.293946 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e0ec04-22b6-48aa-bc81-435b400cb733" containerName="init" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.293964 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e0ec04-22b6-48aa-bc81-435b400cb733" containerName="init" Oct 02 16:37:21 crc kubenswrapper[4882]: E1002 
16:37:21.293982 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af4887d-ad2c-42e6-a473-88947a33d7cd" containerName="glance-db-sync" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.293990 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af4887d-ad2c-42e6-a473-88947a33d7cd" containerName="glance-db-sync" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.294181 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af4887d-ad2c-42e6-a473-88947a33d7cd" containerName="glance-db-sync" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.294200 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e0ec04-22b6-48aa-bc81-435b400cb733" containerName="init" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.294916 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.299537 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.299848 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bfwps" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.312433 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k42n2"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.317079 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-spmcq" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.329321 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerStarted","Data":"86dc7981059b9d97b0eedfb462a4cc0174221e5e19c4ea286cc91b5cf589d50d"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.331228 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89997" event={"ID":"5002216b-8942-42c3-abbe-9c426fa17da6","Type":"ContainerStarted","Data":"7ccab1093e8e71e85336832cf7c700db0388ad4acfad34fbf7a1afe604fdcb53"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.331272 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89997" event={"ID":"5002216b-8942-42c3-abbe-9c426fa17da6","Type":"ContainerStarted","Data":"d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.334709 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.350992 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data\") pod \"6af4887d-ad2c-42e6-a473-88947a33d7cd\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.351139 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data\") pod \"6af4887d-ad2c-42e6-a473-88947a33d7cd\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.351227 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxh2v\" (UniqueName: \"kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v\") pod \"6af4887d-ad2c-42e6-a473-88947a33d7cd\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.351255 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle\") pod \"6af4887d-ad2c-42e6-a473-88947a33d7cd\" (UID: \"6af4887d-ad2c-42e6-a473-88947a33d7cd\") " Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.362627 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6af4887d-ad2c-42e6-a473-88947a33d7cd" (UID: "6af4887d-ad2c-42e6-a473-88947a33d7cd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.363622 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-89997" podStartSLOduration=2.36360783 podStartE2EDuration="2.36360783s" podCreationTimestamp="2025-10-02 16:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:21.363135068 +0000 UTC m=+1200.112364595" watchObservedRunningTime="2025-10-02 16:37:21.36360783 +0000 UTC m=+1200.112837357" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.365891 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8kjnt" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.366256 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8kjnt" event={"ID":"6af4887d-ad2c-42e6-a473-88947a33d7cd","Type":"ContainerDied","Data":"3e5aa6c13400757d73bf8fe7c4216d804d2cc7b2f26cde8e7c03978b57e84336"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.366314 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e5aa6c13400757d73bf8fe7c4216d804d2cc7b2f26cde8e7c03978b57e84336" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.383124 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v" (OuterVolumeSpecName: "kube-api-access-wxh2v") pod "6af4887d-ad2c-42e6-a473-88947a33d7cd" (UID: "6af4887d-ad2c-42e6-a473-88947a33d7cd"). InnerVolumeSpecName "kube-api-access-wxh2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.395705 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" event={"ID":"93e0ec04-22b6-48aa-bc81-435b400cb733","Type":"ContainerDied","Data":"8b6d48ddd039b5e384f529edb0677a126786ec9e579521660682afe3a4e858f5"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.396612 4882 scope.go:117] "RemoveContainer" containerID="9b7ca77af7524648f7c7c3ad2a42db3bc652ac071e57203ec41563715cb50ab8" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.395818 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5788dcf4dc-fwb7d" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.410681 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fcqvb" event={"ID":"1698fd61-b948-4440-a0c2-b4cb3a7f933c","Type":"ContainerStarted","Data":"1733c221706c1533481b08cb762880058a1f4a8d0ccbf96ce79623a738c6fe7a"} Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.410778 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.417829 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af4887d-ad2c-42e6-a473-88947a33d7cd" (UID: "6af4887d-ad2c-42e6-a473-88947a33d7cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456292 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjj7\" (UniqueName: \"kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456380 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456403 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456490 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxh2v\" (UniqueName: \"kubernetes.io/projected/6af4887d-ad2c-42e6-a473-88947a33d7cd-kube-api-access-wxh2v\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456502 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.456511 4882 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.462363 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data" (OuterVolumeSpecName: "config-data") pod "6af4887d-ad2c-42e6-a473-88947a33d7cd" (UID: "6af4887d-ad2c-42e6-a473-88947a33d7cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.502318 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.527756 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc4b7d6c5-qcqgf"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.557545 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjj7\" (UniqueName: \"kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.557635 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.557658 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.557740 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af4887d-ad2c-42e6-a473-88947a33d7cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.585883 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.594853 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8rr77"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.596238 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.598451 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.599264 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.600096 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wnxs6" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.606299 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.607861 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjj7\" (UniqueName: \"kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7\") pod \"barbican-db-sync-k42n2\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.612814 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k42n2" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.630593 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.638190 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5788dcf4dc-fwb7d"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.658860 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8rr77"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.764309 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.764670 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvzs7\" (UniqueName: \"kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.764719 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.842008 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.869986 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvzs7\" (UniqueName: 
\"kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.870086 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.870180 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.876413 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.885923 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.888660 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.929675 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:21 crc kubenswrapper[4882]: I1002 16:37:21.961069 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvzs7\" (UniqueName: \"kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7\") pod \"neutron-db-sync-8rr77\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") " pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.104504 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138726 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138788 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138816 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138831 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138885 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.138951 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jxp\" (UniqueName: \"kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.155727 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8rr77" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.193535 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.195096 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.203377 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.203580 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b2q2c" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.204396 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.217512 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.241515 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.241703 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jxp\" (UniqueName: \"kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.241808 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.241888 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.241963 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.242025 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.255048 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.255226 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.255736 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.256099 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.258107 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.309958 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jxp\" (UniqueName: \"kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp\") pod \"dnsmasq-dns-c9d7754df-vk4cc\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344105 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344156 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344205 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344257 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjr9d\" (UniqueName: \"kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344355 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344380 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.344397 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.361582 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-spmcq"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445405 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445469 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445490 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445531 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445549 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.445585 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.446156 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjr9d\" (UniqueName: \"kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.446588 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.446939 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.447740 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.449839 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.453000 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.453041 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.475348 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-spmcq" event={"ID":"39804fe6-5476-4f26-a743-83c1852229ec","Type":"ContainerStarted","Data":"19ff9e29704d4744761969da2f8a685c5aa710805cf94f5bb9256d82e7548503"} Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.475967 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.494581 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.510702 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tjr9d\" (UniqueName: \"kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d\") pod \"glance-default-external-api-0\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.528057 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" event={"ID":"b8ccb6a5-8202-4bf2-8274-73ab575b6066","Type":"ContainerStarted","Data":"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea"} Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.528095 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" podUID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" containerName="init" containerID="cri-o://7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea" gracePeriod=10 Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.528131 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" event={"ID":"b8ccb6a5-8202-4bf2-8274-73ab575b6066","Type":"ContainerStarted","Data":"508d18f17fe66966db380cd007a0ad0ee17e60034e0f201a16c5423c3f9e9dbb"} Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.570320 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.579785 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.814433 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73cb7217-803e-4ba7-945e-597bc6702c4f" path="/var/lib/kubelet/pods/73cb7217-803e-4ba7-945e-597bc6702c4f/volumes" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.815350 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e0ec04-22b6-48aa-bc81-435b400cb733" path="/var/lib/kubelet/pods/93e0ec04-22b6-48aa-bc81-435b400cb733/volumes" Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.822035 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k42n2"] Oct 02 16:37:22 crc kubenswrapper[4882]: W1002 16:37:22.823044 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a0980b_843c_4fdd_a638_fe28f7bf4491.slice/crio-1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8 WatchSource:0}: Error finding container 1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8: Status 404 returned error can't find the container with id 1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8 Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.925092 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8rr77"] Oct 02 16:37:22 crc kubenswrapper[4882]: I1002 16:37:22.947935 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072645 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072698 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072730 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072834 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072857 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.072967 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f7bh\" (UniqueName: \"kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh\") pod \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\" (UID: \"b8ccb6a5-8202-4bf2-8274-73ab575b6066\") " Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.078039 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh" (OuterVolumeSpecName: "kube-api-access-8f7bh") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "kube-api-access-8f7bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.102190 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.113430 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config" (OuterVolumeSpecName: "config") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.121732 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.128643 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.139204 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8ccb6a5-8202-4bf2-8274-73ab575b6066" (UID: "b8ccb6a5-8202-4bf2-8274-73ab575b6066"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.174957 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.174987 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.174997 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.175006 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f7bh\" (UniqueName: \"kubernetes.io/projected/b8ccb6a5-8202-4bf2-8274-73ab575b6066-kube-api-access-8f7bh\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.175016 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.175024 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ccb6a5-8202-4bf2-8274-73ab575b6066-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.326176 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:23 crc kubenswrapper[4882]: E1002 16:37:23.332116 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" containerName="init" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.332593 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" containerName="init" Oct 02 16:37:23 crc 
kubenswrapper[4882]: I1002 16:37:23.336002 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" containerName="init" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.349789 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.361703 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.396937 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.439349 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.494492 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497372 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497456 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497495 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497527 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497581 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497597 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.497620 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpbbd\" (UniqueName: \"kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.544404 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" event={"ID":"806e2852-f670-4614-9838-cabc0f4ebb79","Type":"ContainerStarted","Data":"4809120486a155d6cc7e141710fb0c02a747072924c746183dea5b9eed03cd2c"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.550731 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8rr77" event={"ID":"ef8e6711-063d-42f2-afb6-426e7617d062","Type":"ContainerStarted","Data":"49daefd74c00c9ed3bfcfe52dc5591e1096340894023627744e4282783dddf06"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.550833 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8rr77" event={"ID":"ef8e6711-063d-42f2-afb6-426e7617d062","Type":"ContainerStarted","Data":"53bcd453860992215d3d81580e89ab9189e91b8e0a53bbace5d197a907f7b694"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.557382 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k42n2" event={"ID":"68a0980b-843c-4fdd-a638-fe28f7bf4491","Type":"ContainerStarted","Data":"1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.562353 4882 generic.go:334] "Generic (PLEG): container finished" podID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" containerID="7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea" exitCode=0 Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.562450 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" event={"ID":"b8ccb6a5-8202-4bf2-8274-73ab575b6066","Type":"ContainerDied","Data":"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.562518 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" event={"ID":"b8ccb6a5-8202-4bf2-8274-73ab575b6066","Type":"ContainerDied","Data":"508d18f17fe66966db380cd007a0ad0ee17e60034e0f201a16c5423c3f9e9dbb"} Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.562570 4882 scope.go:117] "RemoveContainer" containerID="7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.563652 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b97d8f565-7ht8b" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.577466 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8rr77" podStartSLOduration=2.577442604 podStartE2EDuration="2.577442604s" podCreationTimestamp="2025-10-02 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:23.570255102 +0000 UTC m=+1202.319484629" watchObservedRunningTime="2025-10-02 16:37:23.577442604 +0000 UTC m=+1202.326672131" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600589 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600660 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600704 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600817 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600845 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600923 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpbbd\" (UniqueName: \"kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.600971 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.601496 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.601725 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.601795 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.609311 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.609533 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.614531 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.621607 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpbbd\" (UniqueName: \"kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.661825 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.701746 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.799812 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.805173 4882 scope.go:117] "RemoveContainer" containerID="7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea" Oct 02 16:37:23 crc kubenswrapper[4882]: E1002 16:37:23.807637 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea\": container with ID starting with 7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea not found: ID does not exist" containerID="7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.807678 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea"} err="failed to get container status \"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea\": rpc error: code = NotFound desc = could not find container \"7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea\": container with ID starting with 7e8c4427c71ce8d6bb908b92cd11ed8dc81e269bbfa9fdcd7ce658eb7c7e8dea not found: ID does not exist" Oct 02 16:37:23 crc kubenswrapper[4882]: I1002 16:37:23.811312 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b97d8f565-7ht8b"] Oct 02 16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.397994 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:24 crc kubenswrapper[4882]: W1002 16:37:24.428122 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95770dfd_a95d_496d_8b45_1b08ca9c6a7b.slice/crio-457f1c65d6a5de1107af5731b7893aeaebfebcd362f1948209387c8b64b72157 WatchSource:0}: Error finding container 457f1c65d6a5de1107af5731b7893aeaebfebcd362f1948209387c8b64b72157: Status 404 returned error can't find the container with id 457f1c65d6a5de1107af5731b7893aeaebfebcd362f1948209387c8b64b72157 Oct 02 16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.586366 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerStarted","Data":"457f1c65d6a5de1107af5731b7893aeaebfebcd362f1948209387c8b64b72157"} Oct 02 16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.610775 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerStarted","Data":"893db928e41b1c820d7c429ec9571066a1b497f668a3093b576cb7a8cf52fa7c"} Oct 02 16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.698706 4882 generic.go:334] "Generic (PLEG): container finished" podID="806e2852-f670-4614-9838-cabc0f4ebb79" containerID="75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921" exitCode=0 Oct 02 16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.698801 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" event={"ID":"806e2852-f670-4614-9838-cabc0f4ebb79","Type":"ContainerDied","Data":"75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921"} Oct 02 
16:37:24 crc kubenswrapper[4882]: I1002 16:37:24.815845 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ccb6a5-8202-4bf2-8274-73ab575b6066" path="/var/lib/kubelet/pods/b8ccb6a5-8202-4bf2-8274-73ab575b6066/volumes" Oct 02 16:37:25 crc kubenswrapper[4882]: I1002 16:37:25.722755 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" event={"ID":"806e2852-f670-4614-9838-cabc0f4ebb79","Type":"ContainerStarted","Data":"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9"} Oct 02 16:37:25 crc kubenswrapper[4882]: I1002 16:37:25.723087 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:25 crc kubenswrapper[4882]: I1002 16:37:25.728374 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerStarted","Data":"5fb878c84cbf3ecf4d3a64a5f76830b0ee03407f0aa830694e62c1f16e0b88a1"} Oct 02 16:37:25 crc kubenswrapper[4882]: I1002 16:37:25.731434 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerStarted","Data":"691f0b26042b1e93ccf0c11167d837fe96ca5a6bc48134dc8241642e1432945a"} Oct 02 16:37:25 crc kubenswrapper[4882]: I1002 16:37:25.753826 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" podStartSLOduration=4.75380311 podStartE2EDuration="4.75380311s" podCreationTimestamp="2025-10-02 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:25.748044844 +0000 UTC m=+1204.497274371" watchObservedRunningTime="2025-10-02 16:37:25.75380311 +0000 UTC m=+1204.503032647" Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.760039 4882 generic.go:334] "Generic (PLEG): container finished" podID="5002216b-8942-42c3-abbe-9c426fa17da6" containerID="7ccab1093e8e71e85336832cf7c700db0388ad4acfad34fbf7a1afe604fdcb53" exitCode=0 Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.780179 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89997" event={"ID":"5002216b-8942-42c3-abbe-9c426fa17da6","Type":"ContainerDied","Data":"7ccab1093e8e71e85336832cf7c700db0388ad4acfad34fbf7a1afe604fdcb53"} Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.780326 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerStarted","Data":"7c856eebf53a6a819f82a742b4ec639355d2fa7bae41ec826ef21b50e339c276"} Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.780341 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerStarted","Data":"bc1a714e599436e9f70fbbef7b086b063a649e77fab61b0b6b788cfecad52cca"} Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.798006 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.7979798559999995 podStartE2EDuration="4.797979856s" podCreationTimestamp="2025-10-02 16:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 16:37:26.782932545 +0000 UTC m=+1205.532162072" watchObservedRunningTime="2025-10-02 16:37:26.797979856 +0000 UTC m=+1205.547209383" Oct 02 16:37:26 crc kubenswrapper[4882]: I1002 16:37:26.835589 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.835573446 podStartE2EDuration="5.835573446s" podCreationTimestamp="2025-10-02 16:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:26.830445116 +0000 UTC m=+1205.579674643" watchObservedRunningTime="2025-10-02 16:37:26.835573446 +0000 UTC m=+1205.584802963" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.109310 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.248862 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.249050 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc2k2\" (UniqueName: \"kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.249312 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.249367 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.249403 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.249463 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts\") pod \"5002216b-8942-42c3-abbe-9c426fa17da6\" (UID: \"5002216b-8942-42c3-abbe-9c426fa17da6\") " Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.258993 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.259083 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2" (OuterVolumeSpecName: "kube-api-access-pc2k2") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "kube-api-access-pc2k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.259269 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts" (OuterVolumeSpecName: "scripts") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.260586 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.301661 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data" (OuterVolumeSpecName: "config-data") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.301865 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5002216b-8942-42c3-abbe-9c426fa17da6" (UID: "5002216b-8942-42c3-abbe-9c426fa17da6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351721 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351778 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351790 4882 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351801 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351812 4882 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5002216b-8942-42c3-abbe-9c426fa17da6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.351824 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc2k2\" (UniqueName: \"kubernetes.io/projected/5002216b-8942-42c3-abbe-9c426fa17da6-kube-api-access-pc2k2\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.838122 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89997" event={"ID":"5002216b-8942-42c3-abbe-9c426fa17da6","Type":"ContainerDied","Data":"d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4"} Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.838188 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d322a9fbd975f277332198ca24c8fff3fb8450bf007ecd823bd2794ef897e4" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.838284 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-89997" Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.882339 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.882997 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-log" containerID="cri-o://691f0b26042b1e93ccf0c11167d837fe96ca5a6bc48134dc8241642e1432945a" gracePeriod=30 Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.883670 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-httpd" containerID="cri-o://bc1a714e599436e9f70fbbef7b086b063a649e77fab61b0b6b788cfecad52cca" gracePeriod=30 Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.961148 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.961714 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-log" containerID="cri-o://5fb878c84cbf3ecf4d3a64a5f76830b0ee03407f0aa830694e62c1f16e0b88a1" gracePeriod=30 Oct 02 16:37:29 crc kubenswrapper[4882]: I1002 16:37:29.961855 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-httpd" containerID="cri-o://7c856eebf53a6a819f82a742b4ec639355d2fa7bae41ec826ef21b50e339c276" gracePeriod=30 Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.298043 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-89997"] Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.304539 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-89997"] Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.384550 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pnsg9"] Oct 02 16:37:30 crc kubenswrapper[4882]: E1002 16:37:30.385037 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5002216b-8942-42c3-abbe-9c426fa17da6" containerName="keystone-bootstrap" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.385051 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="5002216b-8942-42c3-abbe-9c426fa17da6" containerName="keystone-bootstrap" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.385307 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="5002216b-8942-42c3-abbe-9c426fa17da6" containerName="keystone-bootstrap" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.386081 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.399324 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pnsg9"] Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.408278 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.408727 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.408900 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.409481 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8cq5" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493242 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493307 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493347 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493378 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493464 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.493502 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnklc\" (UniqueName: \"kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603317 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle\") pod 
\"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603374 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnklc\" (UniqueName: \"kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603471 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603502 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603526 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.603549 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.611538 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.611843 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.611893 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.613092 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.614883 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.628298 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnklc\" (UniqueName: \"kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc\") pod \"keystone-bootstrap-pnsg9\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.733945 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.779466 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5002216b-8942-42c3-abbe-9c426fa17da6" path="/var/lib/kubelet/pods/5002216b-8942-42c3-abbe-9c426fa17da6/volumes" Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.855913 4882 generic.go:334] "Generic (PLEG): container finished" podID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerID="5fb878c84cbf3ecf4d3a64a5f76830b0ee03407f0aa830694e62c1f16e0b88a1" exitCode=143 Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.867830 4882 generic.go:334] "Generic (PLEG): container finished" podID="585731cf-e419-4806-b770-d468b6360065" containerID="bc1a714e599436e9f70fbbef7b086b063a649e77fab61b0b6b788cfecad52cca" exitCode=0 Oct 02 16:37:30 crc kubenswrapper[4882]: I1002 16:37:30.867871 4882 generic.go:334] "Generic (PLEG): container finished" podID="585731cf-e419-4806-b770-d468b6360065" containerID="691f0b26042b1e93ccf0c11167d837fe96ca5a6bc48134dc8241642e1432945a" exitCode=143 Oct 02 16:37:31 crc kubenswrapper[4882]: I1002 16:37:31.300103 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerDied","Data":"5fb878c84cbf3ecf4d3a64a5f76830b0ee03407f0aa830694e62c1f16e0b88a1"} Oct 02 16:37:31 crc kubenswrapper[4882]: I1002 16:37:31.300183 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerDied","Data":"bc1a714e599436e9f70fbbef7b086b063a649e77fab61b0b6b788cfecad52cca"} Oct 02 16:37:31 crc kubenswrapper[4882]: I1002 16:37:31.300202 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerDied","Data":"691f0b26042b1e93ccf0c11167d837fe96ca5a6bc48134dc8241642e1432945a"} Oct 02 16:37:31 crc kubenswrapper[4882]: I1002 16:37:31.907777 4882 generic.go:334] "Generic (PLEG): container finished" podID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerID="7c856eebf53a6a819f82a742b4ec639355d2fa7bae41ec826ef21b50e339c276" exitCode=0 Oct 02 16:37:31 crc kubenswrapper[4882]: I1002 16:37:31.907843 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerDied","Data":"7c856eebf53a6a819f82a742b4ec639355d2fa7bae41ec826ef21b50e339c276"} Oct 02 16:37:32 crc kubenswrapper[4882]: I1002 16:37:32.572466 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:37:32 crc kubenswrapper[4882]: I1002 16:37:32.635912 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:37:32 crc kubenswrapper[4882]: I1002 16:37:32.636565 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" containerID="cri-o://a7f79ff6b4f0d3f0eeb3caccd7cd76e6e7917496e6971694dbe19c7d4b3edc27" gracePeriod=10 Oct 02 16:37:33 crc kubenswrapper[4882]: I1002 16:37:33.690947 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Oct 02 16:37:33 crc kubenswrapper[4882]: I1002 16:37:33.943525 4882 generic.go:334] "Generic (PLEG): container finished" podID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerID="a7f79ff6b4f0d3f0eeb3caccd7cd76e6e7917496e6971694dbe19c7d4b3edc27" exitCode=0 Oct 02 16:37:33 crc kubenswrapper[4882]: I1002 16:37:33.943580 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59775c759f-z44vx" event={"ID":"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b","Type":"ContainerDied","Data":"a7f79ff6b4f0d3f0eeb3caccd7cd76e6e7917496e6971694dbe19c7d4b3edc27"} Oct 02 16:37:37 crc kubenswrapper[4882]: E1002 16:37:37.279737 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4" Oct 02 16:37:37 crc kubenswrapper[4882]: E1002 16:37:37.280114 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w64fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-fcqvb_openstack(1698fd61-b948-4440-a0c2-b4cb3a7f933c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:37:37 crc kubenswrapper[4882]: E1002 16:37:37.281466 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-fcqvb" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" Oct 02 16:37:37 crc kubenswrapper[4882]: E1002 16:37:37.989031 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4\\\"\"" pod="openstack/placement-db-sync-fcqvb" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" Oct 02 16:37:38 crc kubenswrapper[4882]: I1002 16:37:38.690576 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Oct 02 16:37:39 crc kubenswrapper[4882]: E1002 16:37:39.690689 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92" Oct 02 16:37:39 crc kubenswrapper[4882]: E1002 16:37:39.691609 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fh546h5c5h576h599h5d7h5f8h574h5fbh5h5b8h5b6h559h589h5f8h75hcch5d7h577h569h678h59h65fh59bhbfh5bdh656h654h677h547h66hfq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpdkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(0f30a029-08da-4bbe-8d68-6a02148e072c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.768284 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.914821 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.915515 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs" (OuterVolumeSpecName: "logs") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.915666 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.916514 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.916606 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.916703 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.916839 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.916883 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.917119 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpbbd\" (UniqueName: \"kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd\") pod \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\" (UID: \"95770dfd-a95d-496d-8b45-1b08ca9c6a7b\") " Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.918244 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.918342 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.922406 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.922422 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts" (OuterVolumeSpecName: "scripts") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.930565 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd" (OuterVolumeSpecName: "kube-api-access-jpbbd") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "kube-api-access-jpbbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.946260 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:39 crc kubenswrapper[4882]: I1002 16:37:39.964908 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data" (OuterVolumeSpecName: "config-data") pod "95770dfd-a95d-496d-8b45-1b08ca9c6a7b" (UID: "95770dfd-a95d-496d-8b45-1b08ca9c6a7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.012223 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"95770dfd-a95d-496d-8b45-1b08ca9c6a7b","Type":"ContainerDied","Data":"457f1c65d6a5de1107af5731b7893aeaebfebcd362f1948209387c8b64b72157"} Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.012281 4882 scope.go:117] "RemoveContainer" containerID="7c856eebf53a6a819f82a742b4ec639355d2fa7bae41ec826ef21b50e339c276" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.012364 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.020318 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.020356 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.020372 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.020387 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.020397 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpbbd\" (UniqueName: \"kubernetes.io/projected/95770dfd-a95d-496d-8b45-1b08ca9c6a7b-kube-api-access-jpbbd\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.067634 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.081204 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.089639 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.118027 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:40 crc kubenswrapper[4882]: E1002 16:37:40.118681 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-httpd" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.118695 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-httpd" Oct 02 16:37:40 crc kubenswrapper[4882]: E1002 16:37:40.118729 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-log" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.118736 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-log" 
Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.121359 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-log" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.121507 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" containerName="glance-httpd" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.124763 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.127853 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.129397 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.129628 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.131872 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231285 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231345 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231390 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231426 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tt48\" (UniqueName: \"kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231469 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231492 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231522 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.231644 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333586 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333649 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tt48\" (UniqueName: \"kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333698 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333725 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333754 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333776 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333801 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.333841 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.334058 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.334295 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.334551 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.339633 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.339793 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.350981 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.361120 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tt48\" (UniqueName: \"kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.361499 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.364029 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.448201 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:37:40 crc kubenswrapper[4882]: I1002 16:37:40.772857 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95770dfd-a95d-496d-8b45-1b08ca9c6a7b" path="/var/lib/kubelet/pods/95770dfd-a95d-496d-8b45-1b08ca9c6a7b/volumes" Oct 02 16:37:48 crc kubenswrapper[4882]: I1002 16:37:48.691307 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Oct 02 16:37:48 crc kubenswrapper[4882]: I1002 16:37:48.692149 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:37:49 crc kubenswrapper[4882]: E1002 16:37:49.167192 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1" Oct 02 16:37:49 crc kubenswrapper[4882]: E1002 16:37:49.168678 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfjj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-k42n2_openstack(68a0980b-843c-4fdd-a638-fe28f7bf4491): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:37:49 crc kubenswrapper[4882]: E1002 16:37:49.169985 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k42n2" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.250320 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.260207 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404752 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc69v\" (UniqueName: \"kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v\") pod \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404801 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404864 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc\") pod \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404921 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404941 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.404995 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb\") pod \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405041 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb\") pod \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405060 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405093 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjr9d\" (UniqueName: \"kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405121 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config\") pod \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\" (UID: \"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405157 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.405180 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs\") pod \"585731cf-e419-4806-b770-d468b6360065\" (UID: \"585731cf-e419-4806-b770-d468b6360065\") " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.406536 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs" (OuterVolumeSpecName: "logs") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.407886 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.412915 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.413471 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d" (OuterVolumeSpecName: "kube-api-access-tjr9d") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "kube-api-access-tjr9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.413608 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v" (OuterVolumeSpecName: "kube-api-access-hc69v") pod "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" (UID: "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b"). InnerVolumeSpecName "kube-api-access-hc69v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.413897 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts" (OuterVolumeSpecName: "scripts") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.438513 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.458523 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" (UID: "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.459325 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" (UID: "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.462905 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config" (OuterVolumeSpecName: "config") pod "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" (UID: "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.470537 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data" (OuterVolumeSpecName: "config-data") pod "585731cf-e419-4806-b770-d468b6360065" (UID: "585731cf-e419-4806-b770-d468b6360065"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.474068 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" (UID: "16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507547 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507592 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507608 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507621 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507635 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507675 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507687 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjr9d\" (UniqueName: \"kubernetes.io/projected/585731cf-e419-4806-b770-d468b6360065-kube-api-access-tjr9d\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507697 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507705 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507713 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585731cf-e419-4806-b770-d468b6360065-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507721 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc69v\" (UniqueName: \"kubernetes.io/projected/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b-kube-api-access-hc69v\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.507729 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585731cf-e419-4806-b770-d468b6360065-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.531349 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 16:37:49 crc kubenswrapper[4882]: I1002 16:37:49.609677 4882 reconciler_common.go:293] "Volume detached for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.108452 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59775c759f-z44vx" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.108443 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59775c759f-z44vx" event={"ID":"16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b","Type":"ContainerDied","Data":"501e5f43c1802e62ab6e7033f8484981b2c719ab78378961cfb0bd8dca134a9b"} Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.113907 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.116384 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"585731cf-e419-4806-b770-d468b6360065","Type":"ContainerDied","Data":"893db928e41b1c820d7c429ec9571066a1b497f668a3093b576cb7a8cf52fa7c"} Oct 02 16:37:50 crc kubenswrapper[4882]: E1002 16:37:50.117356 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1\\\"\"" pod="openstack/barbican-db-sync-k42n2" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.171556 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.185772 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59775c759f-z44vx"] Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.193756 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.214694 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.229509 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:50 crc kubenswrapper[4882]: E1002 16:37:50.229895 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-httpd" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.229908 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-httpd" Oct 02 16:37:50 crc kubenswrapper[4882]: E1002 16:37:50.229929 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="init" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.229936 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="init" Oct 02 16:37:50 crc kubenswrapper[4882]: E1002 16:37:50.229945 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-log" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.229951 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="585731cf-e419-4806-b770-d468b6360065" 
containerName="glance-log" Oct 02 16:37:50 crc kubenswrapper[4882]: E1002 16:37:50.229972 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.229979 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.230134 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.230147 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-httpd" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.230163 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="585731cf-e419-4806-b770-d468b6360065" containerName="glance-log" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.231160 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.236766 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.236936 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.237496 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.326137 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.327730 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.327842 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.327958 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.328079 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmrm6\" (UniqueName: 
\"kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.328299 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.328407 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.328519 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.429808 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.429879 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.429903 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.429944 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.429990 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmrm6\" (UniqueName: \"kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.430043 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.430081 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.430118 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.430439 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.432722 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.433331 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.439076 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.439171 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.439754 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.454111 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.454700 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmrm6\" (UniqueName: \"kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.461603 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.562306 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.772161 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" path="/var/lib/kubelet/pods/16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b/volumes" Oct 02 16:37:50 crc kubenswrapper[4882]: I1002 16:37:50.772934 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585731cf-e419-4806-b770-d468b6360065" path="/var/lib/kubelet/pods/585731cf-e419-4806-b770-d468b6360065/volumes" Oct 02 16:37:51 crc kubenswrapper[4882]: I1002 16:37:51.455023 4882 scope.go:117] "RemoveContainer" containerID="5fb878c84cbf3ecf4d3a64a5f76830b0ee03407f0aa830694e62c1f16e0b88a1" Oct 02 16:37:51 crc kubenswrapper[4882]: E1002 16:37:51.479912 4882 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166" Oct 02 16:37:51 crc kubenswrapper[4882]: E1002 16:37:51.480173 4882 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwrkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-spmcq_openstack(39804fe6-5476-4f26-a743-83c1852229ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 16:37:51 crc kubenswrapper[4882]: E1002 16:37:51.481379 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-spmcq" podUID="39804fe6-5476-4f26-a743-83c1852229ec" Oct 02 16:37:51 crc kubenswrapper[4882]: I1002 16:37:51.918805 4882 scope.go:117] "RemoveContainer" containerID="a7f79ff6b4f0d3f0eeb3caccd7cd76e6e7917496e6971694dbe19c7d4b3edc27" Oct 02 16:37:51 crc kubenswrapper[4882]: I1002 16:37:51.934477 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pnsg9"] Oct 02 16:37:51 crc kubenswrapper[4882]: I1002 16:37:51.955083 4882 scope.go:117] "RemoveContainer" containerID="8eb0dfd758bae7ec26875ce74450792d723e471e7f925fd15d77e95aace4a091" Oct 02 16:37:52 crc kubenswrapper[4882]: I1002 16:37:52.110332 4882 scope.go:117] "RemoveContainer" containerID="bc1a714e599436e9f70fbbef7b086b063a649e77fab61b0b6b788cfecad52cca" Oct 02 16:37:52 crc kubenswrapper[4882]: I1002 16:37:52.129622 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 02 16:37:52 crc kubenswrapper[4882]: I1002 16:37:52.139406 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pnsg9" event={"ID":"00110a10-c070-4648-b80d-33beb4a63b86","Type":"ContainerStarted","Data":"deebae8f68eb0b0293b0020d20e65516d15325ea1dc8c628c5a96024654b3855"} Oct 02 16:37:52 crc kubenswrapper[4882]: E1002 16:37:52.147777 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166\\\"\"" pod="openstack/cinder-db-sync-spmcq" podUID="39804fe6-5476-4f26-a743-83c1852229ec" Oct 02 16:37:52 crc kubenswrapper[4882]: W1002 16:37:52.148484 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65607d96_f458_46ff_b0e9_b4a3cd818657.slice/crio-8295b8fdf13817801a83e0e3949a38d9ae4397f892a84ca55493360babdc997f WatchSource:0}: Error finding container 8295b8fdf13817801a83e0e3949a38d9ae4397f892a84ca55493360babdc997f: Status 404 returned error can't find the container with id 8295b8fdf13817801a83e0e3949a38d9ae4397f892a84ca55493360babdc997f Oct 02 16:37:52 crc kubenswrapper[4882]: I1002 16:37:52.172849 4882 scope.go:117] "RemoveContainer" containerID="691f0b26042b1e93ccf0c11167d837fe96ca5a6bc48134dc8241642e1432945a" Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.162995 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerStarted","Data":"f81ad1b31da85492aee00a390eeeaf1ec2977a5b45ffe9c0c865b0263375acbd"} Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.165061 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerStarted","Data":"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5"} Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.165088 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerStarted","Data":"8295b8fdf13817801a83e0e3949a38d9ae4397f892a84ca55493360babdc997f"} Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.184199 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.186392 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pnsg9" event={"ID":"00110a10-c070-4648-b80d-33beb4a63b86","Type":"ContainerStarted","Data":"ba198663776df5bcfbbe3aad9beb1270b80995557ab603838a2e4d86db558ad2"} Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.191289 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fcqvb" event={"ID":"1698fd61-b948-4440-a0c2-b4cb3a7f933c","Type":"ContainerStarted","Data":"cc8da07ebf41240f0bc3ca44a1ba3b9cdde1ddc0293e1d6702b909eedaa87c97"} Oct 02 16:37:53 crc kubenswrapper[4882]: W1002 16:37:53.202118 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f6d6450_e645_4a76_b0cb_76567cf1307c.slice/crio-f9a7ebb617c7f2f7b177a0be82c00ec77d2d5649a02b835ac9a283f976933827 
WatchSource:0}: Error finding container f9a7ebb617c7f2f7b177a0be82c00ec77d2d5649a02b835ac9a283f976933827: Status 404 returned error can't find the container with id f9a7ebb617c7f2f7b177a0be82c00ec77d2d5649a02b835ac9a283f976933827 Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.236082 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fcqvb" podStartSLOduration=3.042375535 podStartE2EDuration="34.236063486s" podCreationTimestamp="2025-10-02 16:37:19 +0000 UTC" firstStartedPulling="2025-10-02 16:37:21.084297907 +0000 UTC m=+1199.833527434" lastFinishedPulling="2025-10-02 16:37:52.277985858 +0000 UTC m=+1231.027215385" observedRunningTime="2025-10-02 16:37:53.235240796 +0000 UTC m=+1231.984470323" watchObservedRunningTime="2025-10-02 16:37:53.236063486 +0000 UTC m=+1231.985293003" Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.248440 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pnsg9" podStartSLOduration=23.248415179 podStartE2EDuration="23.248415179s" podCreationTimestamp="2025-10-02 16:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:53.214501111 +0000 UTC m=+1231.963730638" watchObservedRunningTime="2025-10-02 16:37:53.248415179 +0000 UTC m=+1231.997644706" Oct 02 16:37:53 crc kubenswrapper[4882]: I1002 16:37:53.692757 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59775c759f-z44vx" podUID="16d7f9d5-95d8-497b-8877-3f4c9e1dfc3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Oct 02 16:37:54 crc kubenswrapper[4882]: I1002 16:37:54.204669 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerStarted","Data":"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c"} Oct 02 16:37:54 crc kubenswrapper[4882]: I1002 16:37:54.205195 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerStarted","Data":"f9a7ebb617c7f2f7b177a0be82c00ec77d2d5649a02b835ac9a283f976933827"} Oct 02 16:37:54 crc kubenswrapper[4882]: I1002 16:37:54.210446 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerStarted","Data":"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd"} Oct 02 16:37:54 crc kubenswrapper[4882]: I1002 16:37:54.242137 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.242112418 podStartE2EDuration="14.242112418s" podCreationTimestamp="2025-10-02 16:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:54.233663884 +0000 UTC m=+1232.982893441" watchObservedRunningTime="2025-10-02 16:37:54.242112418 +0000 UTC m=+1232.991341945" Oct 02 16:37:55 crc kubenswrapper[4882]: I1002 16:37:55.221615 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerStarted","Data":"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9"} Oct 02 16:37:55 crc kubenswrapper[4882]: I1002 16:37:55.245834 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.24580594 podStartE2EDuration="5.24580594s" podCreationTimestamp="2025-10-02 16:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:37:55.24070559 +0000 UTC m=+1233.989935117" watchObservedRunningTime="2025-10-02 16:37:55.24580594 +0000 UTC m=+1233.995035467" Oct 02 16:37:58 crc kubenswrapper[4882]: I1002 16:37:58.247907 4882 generic.go:334] "Generic (PLEG): container finished" podID="00110a10-c070-4648-b80d-33beb4a63b86" containerID="ba198663776df5bcfbbe3aad9beb1270b80995557ab603838a2e4d86db558ad2" exitCode=0 Oct 02 16:37:58 crc kubenswrapper[4882]: I1002 16:37:58.248159 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pnsg9" event={"ID":"00110a10-c070-4648-b80d-33beb4a63b86","Type":"ContainerDied","Data":"ba198663776df5bcfbbe3aad9beb1270b80995557ab603838a2e4d86db558ad2"} Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.263502 4882 generic.go:334] "Generic (PLEG): container finished" podID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" containerID="cc8da07ebf41240f0bc3ca44a1ba3b9cdde1ddc0293e1d6702b909eedaa87c97" exitCode=0 Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.264104 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fcqvb" event={"ID":"1698fd61-b948-4440-a0c2-b4cb3a7f933c","Type":"ContainerDied","Data":"cc8da07ebf41240f0bc3ca44a1ba3b9cdde1ddc0293e1d6702b909eedaa87c97"} Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.270088 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerStarted","Data":"31ce9aefe555c13850ce04c42dbc71bf18ceeaa39ebe012a02500b2d5badf800"} Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.663603 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724412 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724477 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724527 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnklc\" (UniqueName: \"kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724662 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724717 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.724753 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys\") pod \"00110a10-c070-4648-b80d-33beb4a63b86\" (UID: \"00110a10-c070-4648-b80d-33beb4a63b86\") " Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.731138 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts" (OuterVolumeSpecName: "scripts") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.731537 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.731926 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.732691 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc" (OuterVolumeSpecName: "kube-api-access-nnklc") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "kube-api-access-nnklc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.760139 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data" (OuterVolumeSpecName: "config-data") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.767194 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00110a10-c070-4648-b80d-33beb4a63b86" (UID: "00110a10-c070-4648-b80d-33beb4a63b86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827624 4882 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827673 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827689 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnklc\" (UniqueName: \"kubernetes.io/projected/00110a10-c070-4648-b80d-33beb4a63b86-kube-api-access-nnklc\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827705 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827731 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:37:59 crc kubenswrapper[4882]: I1002 16:37:59.827745 4882 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00110a10-c070-4648-b80d-33beb4a63b86-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.280268 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pnsg9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.280284 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pnsg9" event={"ID":"00110a10-c070-4648-b80d-33beb4a63b86","Type":"ContainerDied","Data":"deebae8f68eb0b0293b0020d20e65516d15325ea1dc8c628c5a96024654b3855"} Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.280653 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deebae8f68eb0b0293b0020d20e65516d15325ea1dc8c628c5a96024654b3855" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.434047 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"] Oct 02 16:38:00 crc kubenswrapper[4882]: E1002 16:38:00.436186 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00110a10-c070-4648-b80d-33beb4a63b86" containerName="keystone-bootstrap" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.436266 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="00110a10-c070-4648-b80d-33beb4a63b86" containerName="keystone-bootstrap" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.436812 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="00110a10-c070-4648-b80d-33beb4a63b86" containerName="keystone-bootstrap" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.438191 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.444595 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.444904 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.445064 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.445201 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8cq5" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.445415 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.446187 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.449411 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.450251 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.476641 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"] Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.531858 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.578672 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582327 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582413 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582438 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrcs\" (UniqueName: \"kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582504 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582636 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582723 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582753 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.582833 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.584671 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.599736 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.657461 4882 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694168 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrcs\" (UniqueName: \"kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694429 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694546 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694628 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694738 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694854 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.694980 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.695062 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.700173 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.706892 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.719652 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.720309 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.720532 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.722854 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.731880 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrcs\" (UniqueName: \"kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.734097 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.735111 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data\") pod \"keystone-7df8f99cc4-7tfd9\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.799156 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.823807 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fcqvb" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.899285 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs\") pod \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.899408 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle\") pod \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.899536 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts\") pod \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.899574 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w64fp\" (UniqueName: \"kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp\") pod \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.899622 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data\") pod \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\" (UID: \"1698fd61-b948-4440-a0c2-b4cb3a7f933c\") " Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.900669 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs" (OuterVolumeSpecName: "logs") pod "1698fd61-b948-4440-a0c2-b4cb3a7f933c" (UID: "1698fd61-b948-4440-a0c2-b4cb3a7f933c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.910384 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts" (OuterVolumeSpecName: "scripts") pod "1698fd61-b948-4440-a0c2-b4cb3a7f933c" (UID: "1698fd61-b948-4440-a0c2-b4cb3a7f933c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.935382 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp" (OuterVolumeSpecName: "kube-api-access-w64fp") pod "1698fd61-b948-4440-a0c2-b4cb3a7f933c" (UID: "1698fd61-b948-4440-a0c2-b4cb3a7f933c"). InnerVolumeSpecName "kube-api-access-w64fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:00 crc kubenswrapper[4882]: I1002 16:38:00.951614 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data" (OuterVolumeSpecName: "config-data") pod "1698fd61-b948-4440-a0c2-b4cb3a7f933c" (UID: "1698fd61-b948-4440-a0c2-b4cb3a7f933c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.003004 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.003072 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1698fd61-b948-4440-a0c2-b4cb3a7f933c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.003085 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.003098 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w64fp\" (UniqueName: \"kubernetes.io/projected/1698fd61-b948-4440-a0c2-b4cb3a7f933c-kube-api-access-w64fp\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.011474 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1698fd61-b948-4440-a0c2-b4cb3a7f933c" (UID: "1698fd61-b948-4440-a0c2-b4cb3a7f933c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.104401 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1698fd61-b948-4440-a0c2-b4cb3a7f933c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.292944 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fcqvb" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.293032 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fcqvb" event={"ID":"1698fd61-b948-4440-a0c2-b4cb3a7f933c","Type":"ContainerDied","Data":"1733c221706c1533481b08cb762880058a1f4a8d0ccbf96ce79623a738c6fe7a"} Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.293076 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1733c221706c1533481b08cb762880058a1f4a8d0ccbf96ce79623a738c6fe7a" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.293790 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.293890 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.297123 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.297249 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.389110 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"] Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.422789 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:38:01 crc kubenswrapper[4882]: E1002 16:38:01.428666 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" containerName="placement-db-sync" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.428727 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" containerName="placement-db-sync" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.428966 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" containerName="placement-db-sync" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.430073 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.437703 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.437989 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.438155 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.438313 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.438455 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5v96v" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.457160 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.518646 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.518692 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.518768 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.518787 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.519304 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.519349 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5nbp\" (UniqueName: \"kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 
16:38:01.519382 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621098 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621150 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5nbp\" (UniqueName: \"kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621182 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621241 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621262 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621326 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621343 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.621915 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.629198 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.632692 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.633759 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.637347 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.640942 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.642511 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5nbp\" (UniqueName: \"kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp\") pod \"placement-568599c566-7t7js\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:01 crc kubenswrapper[4882]: I1002 16:38:01.773404 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:02 crc kubenswrapper[4882]: I1002 16:38:02.308522 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7df8f99cc4-7tfd9" event={"ID":"d5be6998-0b42-475a-8418-032327087ace","Type":"ContainerStarted","Data":"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"} Oct 02 16:38:02 crc kubenswrapper[4882]: I1002 16:38:02.309443 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7df8f99cc4-7tfd9" event={"ID":"d5be6998-0b42-475a-8418-032327087ace","Type":"ContainerStarted","Data":"944ba7b88bc462c3ad42cc6d9d42d3f227e589ace3a3294c22873e53a3517968"} Oct 02 16:38:02 crc kubenswrapper[4882]: I1002 16:38:02.370053 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7df8f99cc4-7tfd9" podStartSLOduration=2.370026119 podStartE2EDuration="2.370026119s" podCreationTimestamp="2025-10-02 16:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:02.342270567 +0000 UTC m=+1241.091500094" watchObservedRunningTime="2025-10-02 16:38:02.370026119 +0000 UTC m=+1241.119255646" Oct 02 16:38:02 crc kubenswrapper[4882]: I1002 16:38:02.380437 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.384288 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerStarted","Data":"9e298188f83c2dcdfd495c92ac971fa8c62421680bfe7edca5990f87e39223ce"} Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.384650 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerStarted","Data":"f1f9da0a3b102e508fb5fdb3b339242c8671cf3158ae59f00c1168bd693b7754"} Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.384674 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.477709 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.478120 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.480571 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.758319 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 16:38:03 crc kubenswrapper[4882]: I1002 16:38:03.758444 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:38:04 crc kubenswrapper[4882]: I1002 16:38:04.348647 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 16:38:04 crc kubenswrapper[4882]: I1002 16:38:04.402962 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerStarted","Data":"c61dc991f9c47938a3b335bd886b13edcae91852caa89730fe80338f674f2a36"} Oct 
02 16:38:04 crc kubenswrapper[4882]: I1002 16:38:04.403886 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:04 crc kubenswrapper[4882]: I1002 16:38:04.403925 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:04 crc kubenswrapper[4882]: I1002 16:38:04.434193 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-568599c566-7t7js" podStartSLOduration=3.434169217 podStartE2EDuration="3.434169217s" podCreationTimestamp="2025-10-02 16:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:04.432417123 +0000 UTC m=+1243.181646670" watchObservedRunningTime="2025-10-02 16:38:04.434169217 +0000 UTC m=+1243.183398744" Oct 02 16:38:10 crc kubenswrapper[4882]: E1002 16:38:10.536734 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.483506 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerStarted","Data":"6165d613d1a7aef5ed971f11979a462c24039c6378ae7708275350821777e5e6"} Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.483614 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="ceilometer-notification-agent" containerID="cri-o://f81ad1b31da85492aee00a390eeeaf1ec2977a5b45ffe9c0c865b0263375acbd" gracePeriod=30 Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.483673 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="proxy-httpd" containerID="cri-o://6165d613d1a7aef5ed971f11979a462c24039c6378ae7708275350821777e5e6" gracePeriod=30 Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.483776 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="sg-core" containerID="cri-o://31ce9aefe555c13850ce04c42dbc71bf18ceeaa39ebe012a02500b2d5badf800" gracePeriod=30 Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.484006 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.486999 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-spmcq" event={"ID":"39804fe6-5476-4f26-a743-83c1852229ec","Type":"ContainerStarted","Data":"0a8b48d27f4da685ade95ade8fb027b77858817f1978dd7e0a894eef8c0afdff"} Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.492151 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k42n2" event={"ID":"68a0980b-843c-4fdd-a638-fe28f7bf4491","Type":"ContainerStarted","Data":"929fdba2157f4a815e2411de0bf84cb77c8d2439b65e441725cfba13acde8d5c"} Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.543501 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-spmcq" podStartSLOduration=3.63896802 podStartE2EDuration="51.543474049s" podCreationTimestamp="2025-10-02 16:37:20 +0000 UTC" firstStartedPulling="2025-10-02 16:37:22.404086522 +0000 UTC m=+1201.153316049" lastFinishedPulling="2025-10-02 16:38:10.308592551 +0000 UTC m=+1249.057822078" observedRunningTime="2025-10-02 16:38:11.537397514 +0000 UTC m=+1250.286627041" watchObservedRunningTime="2025-10-02 16:38:11.543474049 +0000 UTC m=+1250.292703576" Oct 02 16:38:11 crc kubenswrapper[4882]: I1002 16:38:11.570190 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k42n2" podStartSLOduration=3.327015539 podStartE2EDuration="50.570169344s" podCreationTimestamp="2025-10-02 16:37:21 +0000 UTC" firstStartedPulling="2025-10-02 16:37:22.84061013 +0000 UTC m=+1201.589839657" lastFinishedPulling="2025-10-02 16:38:10.083763925 +0000 UTC m=+1248.832993462" observedRunningTime="2025-10-02 16:38:11.562062499 +0000 UTC m=+1250.311292026" watchObservedRunningTime="2025-10-02 16:38:11.570169344 +0000 UTC m=+1250.319398861" Oct 02 16:38:12 crc kubenswrapper[4882]: I1002 16:38:12.522365 4882 generic.go:334] "Generic (PLEG): container finished" podID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerID="6165d613d1a7aef5ed971f11979a462c24039c6378ae7708275350821777e5e6" exitCode=0 Oct 02 16:38:12 crc kubenswrapper[4882]: I1002 16:38:12.523157 4882 generic.go:334] "Generic (PLEG): container finished" podID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerID="31ce9aefe555c13850ce04c42dbc71bf18ceeaa39ebe012a02500b2d5badf800" exitCode=2 Oct 02 16:38:12 crc kubenswrapper[4882]: I1002 16:38:12.522455 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerDied","Data":"6165d613d1a7aef5ed971f11979a462c24039c6378ae7708275350821777e5e6"} Oct 02 16:38:12 crc kubenswrapper[4882]: I1002 16:38:12.523248 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerDied","Data":"31ce9aefe555c13850ce04c42dbc71bf18ceeaa39ebe012a02500b2d5badf800"} Oct 02 16:38:14 crc kubenswrapper[4882]: I1002 16:38:14.545196 4882 generic.go:334] "Generic (PLEG): container finished" podID="ef8e6711-063d-42f2-afb6-426e7617d062" containerID="49daefd74c00c9ed3bfcfe52dc5591e1096340894023627744e4282783dddf06" exitCode=0 Oct 02 16:38:14 crc kubenswrapper[4882]: I1002 16:38:14.545275 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8rr77" event={"ID":"ef8e6711-063d-42f2-afb6-426e7617d062","Type":"ContainerDied","Data":"49daefd74c00c9ed3bfcfe52dc5591e1096340894023627744e4282783dddf06"} Oct 02 16:38:15 crc kubenswrapper[4882]: I1002 16:38:15.578171 4882 generic.go:334] "Generic (PLEG): container finished" podID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerID="f81ad1b31da85492aee00a390eeeaf1ec2977a5b45ffe9c0c865b0263375acbd" exitCode=0 Oct 02 16:38:15 crc kubenswrapper[4882]: I1002 16:38:15.578248 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerDied","Data":"f81ad1b31da85492aee00a390eeeaf1ec2977a5b45ffe9c0c865b0263375acbd"} Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.000850 4882 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.000850 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.008006 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8rr77"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033306 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033370 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033405 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033432 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle\") pod \"ef8e6711-063d-42f2-afb6-426e7617d062\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033460 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033528 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config\") pod \"ef8e6711-063d-42f2-afb6-426e7617d062\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033591 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvzs7\" (UniqueName: \"kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7\") pod \"ef8e6711-063d-42f2-afb6-426e7617d062\" (UID: \"ef8e6711-063d-42f2-afb6-426e7617d062\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033636 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpdkv\" (UniqueName: \"kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033677 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033726 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml\") pod \"0f30a029-08da-4bbe-8d68-6a02148e072c\" (UID: \"0f30a029-08da-4bbe-8d68-6a02148e072c\") "
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.033815 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.034050 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.034620 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.041940 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7" (OuterVolumeSpecName: "kube-api-access-fvzs7") pod "ef8e6711-063d-42f2-afb6-426e7617d062" (UID: "ef8e6711-063d-42f2-afb6-426e7617d062"). InnerVolumeSpecName "kube-api-access-fvzs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.043619 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts" (OuterVolumeSpecName: "scripts") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.048786 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv" (OuterVolumeSpecName: "kube-api-access-kpdkv") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "kube-api-access-kpdkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.069518 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.081350 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config" (OuterVolumeSpecName: "config") pod "ef8e6711-063d-42f2-afb6-426e7617d062" (UID: "ef8e6711-063d-42f2-afb6-426e7617d062"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.082861 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef8e6711-063d-42f2-afb6-426e7617d062" (UID: "ef8e6711-063d-42f2-afb6-426e7617d062"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.100865 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.109090 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data" (OuterVolumeSpecName: "config-data") pod "0f30a029-08da-4bbe-8d68-6a02148e072c" (UID: "0f30a029-08da-4bbe-8d68-6a02148e072c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135510 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-config\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135549 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvzs7\" (UniqueName: \"kubernetes.io/projected/ef8e6711-063d-42f2-afb6-426e7617d062-kube-api-access-fvzs7\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135584 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpdkv\" (UniqueName: \"kubernetes.io/projected/0f30a029-08da-4bbe-8d68-6a02148e072c-kube-api-access-kpdkv\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135593 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f30a029-08da-4bbe-8d68-6a02148e072c-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135604 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135613 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135620 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135628 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef8e6711-063d-42f2-afb6-426e7617d062-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.135659 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30a029-08da-4bbe-8d68-6a02148e072c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.598972 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.599076 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f30a029-08da-4bbe-8d68-6a02148e072c","Type":"ContainerDied","Data":"86dc7981059b9d97b0eedfb462a4cc0174221e5e19c4ea286cc91b5cf589d50d"}
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.599141 4882 scope.go:117] "RemoveContainer" containerID="6165d613d1a7aef5ed971f11979a462c24039c6378ae7708275350821777e5e6"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.604877 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8rr77" event={"ID":"ef8e6711-063d-42f2-afb6-426e7617d062","Type":"ContainerDied","Data":"53bcd453860992215d3d81580e89ab9189e91b8e0a53bbace5d197a907f7b694"}
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.604931 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53bcd453860992215d3d81580e89ab9189e91b8e0a53bbace5d197a907f7b694"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.604903 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8rr77"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.610619 4882 generic.go:334] "Generic (PLEG): container finished" podID="68a0980b-843c-4fdd-a638-fe28f7bf4491" containerID="929fdba2157f4a815e2411de0bf84cb77c8d2439b65e441725cfba13acde8d5c" exitCode=0
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.610671 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k42n2" event={"ID":"68a0980b-843c-4fdd-a638-fe28f7bf4491","Type":"ContainerDied","Data":"929fdba2157f4a815e2411de0bf84cb77c8d2439b65e441725cfba13acde8d5c"}
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.644982 4882 scope.go:117] "RemoveContainer" containerID="31ce9aefe555c13850ce04c42dbc71bf18ceeaa39ebe012a02500b2d5badf800"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.666071 4882 scope.go:117] "RemoveContainer" containerID="f81ad1b31da85492aee00a390eeeaf1ec2977a5b45ffe9c0c865b0263375acbd"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.683352 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.701512 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.711444 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:38:16 crc kubenswrapper[4882]: E1002 16:38:16.713711 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="proxy-httpd"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.713853 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="proxy-httpd"
Oct 02 16:38:16 crc kubenswrapper[4882]: E1002 16:38:16.713890 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="sg-core"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.713899 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="sg-core"
Oct 02 16:38:16 crc kubenswrapper[4882]: E1002 16:38:16.713958 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="ceilometer-notification-agent"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.713967 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="ceilometer-notification-agent"
Oct 02 16:38:16 crc kubenswrapper[4882]: E1002 16:38:16.714027 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8e6711-063d-42f2-afb6-426e7617d062" containerName="neutron-db-sync"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.714037 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8e6711-063d-42f2-afb6-426e7617d062" containerName="neutron-db-sync"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.714447 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="proxy-httpd"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.714505 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="sg-core"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.714528 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" containerName="ceilometer-notification-agent"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.714580 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8e6711-063d-42f2-afb6-426e7617d062" containerName="neutron-db-sync"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.717577 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.720854 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.721060 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.725063 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747545 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747599 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747645 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747693 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747757 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747834 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.747863 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr5vb\" (UniqueName: \"kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.772274 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f30a029-08da-4bbe-8d68-6a02148e072c" path="/var/lib/kubelet/pods/0f30a029-08da-4bbe-8d68-6a02148e072c/volumes"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849446 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849533 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849605 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849628 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr5vb\" (UniqueName: \"kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849669 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849686 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.849711 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.850123 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.850762 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.857244 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.858169 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.866422 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.879998 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.881723 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.883424 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.888941 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr5vb\" (UniqueName: \"kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb\") pod \"ceilometer-0\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " pod="openstack/ceilometer-0"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.917787 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.931674 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.939925 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.944546 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"]
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.945918 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wnxs6"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.946112 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.946238 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.946554 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951159 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951238 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951287 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951338 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951388 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:16 crc kubenswrapper[4882]: I1002 16:38:16.951417 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47wzj\" (UniqueName: \"kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.036577 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056524 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056586 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47wzj\" (UniqueName: \"kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056627 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056660 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056717 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056756 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056783 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056808 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2qf\" (UniqueName: \"kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056847 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056878 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.056919 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.057797 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.058561 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.058833 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.059001 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.069040 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.077052 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47wzj\" (UniqueName: \"kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj\") pod \"dnsmasq-dns-dfd5bcfb5-s5hvm\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.159024 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.162453 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.162540 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.163075 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.163162 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2qf\" (UniqueName: \"kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.172425 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.173356 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.173783 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.174308 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.185509 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2qf\" (UniqueName: \"kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf\") pod \"neutron-fb74689f6-xjtxd\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.276445 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.284971 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb74689f6-xjtxd"
Oct 02 16:38:17 crc kubenswrapper[4882]: W1002 16:38:17.304345 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0264e02a_1dfc_4dc4_8285_bc40e270916d.slice/crio-ff9d994a17bd00c403f560fcb65cf12cebbefad3ac60fff4588dcaac5b747b03 WatchSource:0}: Error finding container ff9d994a17bd00c403f560fcb65cf12cebbefad3ac60fff4588dcaac5b747b03: Status 404 returned error can't find the container with id ff9d994a17bd00c403f560fcb65cf12cebbefad3ac60fff4588dcaac5b747b03
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.313417 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.612019 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"]
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.620576 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerStarted","Data":"ff9d994a17bd00c403f560fcb65cf12cebbefad3ac60fff4588dcaac5b747b03"}
Oct 02 16:38:17 crc kubenswrapper[4882]: I1002 16:38:17.976952 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"]
Oct 02 16:38:18 crc kubenswrapper[4882]: I1002 16:38:18.636909 4882 generic.go:334] "Generic (PLEG): container finished" podID="39804fe6-5476-4f26-a743-83c1852229ec" containerID="0a8b48d27f4da685ade95ade8fb027b77858817f1978dd7e0a894eef8c0afdff" exitCode=0
Oct 02 16:38:18 crc kubenswrapper[4882]: I1002 16:38:18.637038 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-spmcq" event={"ID":"39804fe6-5476-4f26-a743-83c1852229ec","Type":"ContainerDied","Data":"0a8b48d27f4da685ade95ade8fb027b77858817f1978dd7e0a894eef8c0afdff"}
Oct 02 16:38:18 crc kubenswrapper[4882]: I1002 16:38:18.640864 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" event={"ID":"ced43e79-8931-43f2-8984-8d4db604759e","Type":"ContainerStarted","Data":"723b1327484818910db7b4a5439c15fea130428af589d6f10d9651dc4f0594a1"}
Oct 02 16:38:19 crc kubenswrapper[4882]: I1002 16:38:19.915492 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7994f6475f-bw8mv"]
Oct 02 16:38:19 crc kubenswrapper[4882]: I1002 16:38:19.924897 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:19 crc kubenswrapper[4882]: I1002 16:38:19.929337 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Oct 02 16:38:19 crc kubenswrapper[4882]: I1002 16:38:19.929560 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Oct 02 16:38:19 crc kubenswrapper[4882]: I1002 16:38:19.939441 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7994f6475f-bw8mv"]
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042475 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042593 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042628 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042658 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042696 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042801 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-httpd-config\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.042863 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sw6q\" (UniqueName: \"kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.143951 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144017 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144049 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144077 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144122 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144184 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-httpd-config\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.144782 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sw6q\" (UniqueName: \"kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.149257 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.149482 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv"
\"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.150500 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.151652 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.156382 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.162033 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sw6q\" (UniqueName: \"kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q\") pod \"neutron-7994f6475f-bw8mv\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: W1002 16:38:20.170450 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7ab9abe_db2d_4e60_be2f_89eef3908282.slice/crio-eda32871d96b18dcb6bd67cc00dd51e48e9da6a42514e4454a22348ea045ae08 WatchSource:0}: Error finding container eda32871d96b18dcb6bd67cc00dd51e48e9da6a42514e4454a22348ea045ae08: Status 404 returned error can't find the container with id eda32871d96b18dcb6bd67cc00dd51e48e9da6a42514e4454a22348ea045ae08 Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.251476 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.434259 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k42n2" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.442856 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-spmcq" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555342 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555557 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555664 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle\") pod \"68a0980b-843c-4fdd-a638-fe28f7bf4491\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555834 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555921 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555976 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrkd\" (UniqueName: \"kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.556017 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjj7\" (UniqueName: \"kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7\") pod \"68a0980b-843c-4fdd-a638-fe28f7bf4491\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.556055 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data\") pod \"68a0980b-843c-4fdd-a638-fe28f7bf4491\" (UID: \"68a0980b-843c-4fdd-a638-fe28f7bf4491\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.556089 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts\") pod \"39804fe6-5476-4f26-a743-83c1852229ec\" (UID: \"39804fe6-5476-4f26-a743-83c1852229ec\") " Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.555644 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.562054 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7" (OuterVolumeSpecName: "kube-api-access-vfjj7") pod "68a0980b-843c-4fdd-a638-fe28f7bf4491" (UID: "68a0980b-843c-4fdd-a638-fe28f7bf4491"). InnerVolumeSpecName "kube-api-access-vfjj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.562537 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "68a0980b-843c-4fdd-a638-fe28f7bf4491" (UID: "68a0980b-843c-4fdd-a638-fe28f7bf4491"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.568833 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd" (OuterVolumeSpecName: "kube-api-access-qwrkd") pod "39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "kube-api-access-qwrkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.570366 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.575434 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts" (OuterVolumeSpecName: "scripts") pod "39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.594837 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.596168 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68a0980b-843c-4fdd-a638-fe28f7bf4491" (UID: "68a0980b-843c-4fdd-a638-fe28f7bf4491"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.642655 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data" (OuterVolumeSpecName: "config-data") pod "39804fe6-5476-4f26-a743-83c1852229ec" (UID: "39804fe6-5476-4f26-a743-83c1852229ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.660976 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661019 4882 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39804fe6-5476-4f26-a743-83c1852229ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661034 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661047 4882 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661060 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661071 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrkd\" (UniqueName: \"kubernetes.io/projected/39804fe6-5476-4f26-a743-83c1852229ec-kube-api-access-qwrkd\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661083 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjj7\" (UniqueName: \"kubernetes.io/projected/68a0980b-843c-4fdd-a638-fe28f7bf4491-kube-api-access-vfjj7\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661095 4882 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/68a0980b-843c-4fdd-a638-fe28f7bf4491-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.661106 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39804fe6-5476-4f26-a743-83c1852229ec-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.667279 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerStarted","Data":"e4a13efc451885f02f25e9cf364db2fec60e58e398a693cacb7fa46d56f790f1"} Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.677482 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-spmcq" event={"ID":"39804fe6-5476-4f26-a743-83c1852229ec","Type":"ContainerDied","Data":"19ff9e29704d4744761969da2f8a685c5aa710805cf94f5bb9256d82e7548503"} Oct 02 16:38:20 crc 
kubenswrapper[4882]: I1002 16:38:20.677550 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19ff9e29704d4744761969da2f8a685c5aa710805cf94f5bb9256d82e7548503" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.677504 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-spmcq" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.679258 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k42n2" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.679349 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k42n2" event={"ID":"68a0980b-843c-4fdd-a638-fe28f7bf4491","Type":"ContainerDied","Data":"1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8"} Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.679388 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b734e147bbbe028ca4cc47abc817345193e3fc88c3b1432136477d76a469ca8" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.682957 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerStarted","Data":"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693"} Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.683010 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerStarted","Data":"eda32871d96b18dcb6bd67cc00dd51e48e9da6a42514e4454a22348ea045ae08"} Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.684477 4882 generic.go:334] "Generic (PLEG): container finished" podID="ced43e79-8931-43f2-8984-8d4db604759e" containerID="a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716" exitCode=0 Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.684510 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" event={"ID":"ced43e79-8931-43f2-8984-8d4db604759e","Type":"ContainerDied","Data":"a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716"} Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.894566 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:20 crc kubenswrapper[4882]: E1002 16:38:20.895502 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" containerName="barbican-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.895522 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" containerName="barbican-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: E1002 16:38:20.895562 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39804fe6-5476-4f26-a743-83c1852229ec" containerName="cinder-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.895569 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="39804fe6-5476-4f26-a743-83c1852229ec" containerName="cinder-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.895813 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" containerName="barbican-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.895836 4882 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="39804fe6-5476-4f26-a743-83c1852229ec" containerName="cinder-db-sync" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.897099 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.903118 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.911948 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-pr5rh" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.912102 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.912952 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.914311 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.978051 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"] Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.982996 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.983076 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.983117 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.983162 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.983251 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.983325 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddrzk\" (UniqueName: \"kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " 
pod="openstack/cinder-scheduler-0" Oct 02 16:38:20 crc kubenswrapper[4882]: I1002 16:38:20.996904 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7994f6475f-bw8mv"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.052503 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.054857 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.081058 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.088283 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddrzk\" (UniqueName: \"kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.094273 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.094446 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.094532 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.094649 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.094848 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.097565 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.104162 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.105931 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.106485 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.116584 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.117275 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddrzk\" (UniqueName: \"kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk\") pod \"cinder-scheduler-0\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.196999 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.197080 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.197198 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.197253 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.197353 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 
16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.197400 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ssp\" (UniqueName: \"kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.200065 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.201921 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.212701 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.262325 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.282769 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.308890 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.308977 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309020 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhv7v\" (UniqueName: \"kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309070 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309107 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309156 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309193 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ssp\" (UniqueName: \"kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309305 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309331 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309435 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309490 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309539 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.309570 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.310633 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.310764 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.311408 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.311528 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.312852 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.387016 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ssp\" (UniqueName: \"kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp\") pod \"dnsmasq-dns-dcc985965-nv5s6\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.428819 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.436864 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.436919 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.436975 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhv7v\" (UniqueName: \"kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.437043 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.437065 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.437182 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.437335 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.438284 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.440749 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.446434 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.452672 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.461700 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.465833 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.476498 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhv7v\" (UniqueName: \"kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v\") pod \"cinder-api-0\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.732031 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.749744 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.752889 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.756476 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.756779 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.758809 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bfwps" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.769537 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerStarted","Data":"a60e6c9cdcc2714bfb5170a13236c8bfb097fd20cb13a05ab49aa870f71ada34"} Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.776818 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.779395 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.780795 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerStarted","Data":"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5"} Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.781066 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fb74689f6-xjtxd" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.787160 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.802977 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerStarted","Data":"94c670bb838359315515a372b7547069910917af30838091af0996d23b21c9e8"} Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.804498 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerStarted","Data":"c7c496ab2be5e70623612e2f8bc917a18fad6211fd9c5cf636941cdde346d7ed"} Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.820559 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.826224 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="dnsmasq-dns" containerID="cri-o://7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669" gracePeriod=10 Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.826037 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" event={"ID":"ced43e79-8931-43f2-8984-8d4db604759e","Type":"ContainerStarted","Data":"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669"} Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.826608 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.833911 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851203 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851260 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851303 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851338 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851406 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6h9q\" (UniqueName: \"kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851433 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851463 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmr2\" (UniqueName: \"kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851486 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data\") pod \"barbican-worker-68fcbbc6fc-4n89r\" 
(UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851515 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.851555 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.878614 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fb74689f6-xjtxd" podStartSLOduration=5.878587925 podStartE2EDuration="5.878587925s" podCreationTimestamp="2025-10-02 16:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:21.849869709 +0000 UTC m=+1260.599099236" watchObservedRunningTime="2025-10-02 16:38:21.878587925 +0000 UTC m=+1260.627817462" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.880664 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.908134 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.911477 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.929680 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.944258 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" podStartSLOduration=5.944233365 podStartE2EDuration="5.944233365s" podCreationTimestamp="2025-10-02 16:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:21.905970248 +0000 UTC m=+1260.655199785" watchObservedRunningTime="2025-10-02 16:38:21.944233365 +0000 UTC m=+1260.693462892" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954023 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6h9q\" (UniqueName: \"kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954078 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954121 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmr2\" (UniqueName: \"kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954144 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954176 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954205 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954240 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: 
\"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954258 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954298 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.954319 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.962839 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.963458 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.975671 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.977290 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.981469 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:21 crc kubenswrapper[4882]: I1002 16:38:21.996478 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.003255 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.003396 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.010014 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.019847 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmr2\" (UniqueName: \"kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2\") pod \"barbican-keystone-listener-dccccfdb8-h8ffg\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") " pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.040491 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.040732 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6h9q\" (UniqueName: \"kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q\") pod \"barbican-worker-68fcbbc6fc-4n89r\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") " pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.042352 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056246 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056299 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056326 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056376 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpd2\" (UniqueName: \"kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056438 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.056495 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.058534 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.069655 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.107678 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.121918 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.193068 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205162 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205502 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dhn\" (UniqueName: \"kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205621 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205744 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205811 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205889 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.205965 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.206092 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpd2\" (UniqueName: \"kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2\") pod 
\"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.206165 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.206338 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.206430 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.207151 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.207300 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.207709 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.208007 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.250343 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.266263 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpd2\" (UniqueName: \"kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2\") pod \"dnsmasq-dns-5d9668db97-bkpfp\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") " pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320147 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320677 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dhn\" (UniqueName: \"kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320743 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320779 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320867 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.320921 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.321369 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.361731 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dhn\" (UniqueName: \"kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.362116 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.363362 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 
16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.366362 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom\") pod \"barbican-api-75654db7dd-lcgbh\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.407941 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.579714 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.639939 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.640325 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47wzj\" (UniqueName: \"kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.640355 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.640379 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.640422 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.640447 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc\") pod \"ced43e79-8931-43f2-8984-8d4db604759e\" (UID: \"ced43e79-8931-43f2-8984-8d4db604759e\") " Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.652518 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj" (OuterVolumeSpecName: "kube-api-access-47wzj") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "kube-api-access-47wzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:22 crc kubenswrapper[4882]: I1002 16:38:22.796201 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47wzj\" (UniqueName: \"kubernetes.io/projected/ced43e79-8931-43f2-8984-8d4db604759e-kube-api-access-47wzj\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.014512 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.015359 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.037030 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" event={"ID":"910ee338-ad58-468b-a2b6-08121fe1cadd","Type":"ContainerStarted","Data":"8abd3fc522404a71c144a23c141d1885c61123e243f048665125b0f2e0984375"} Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.069581 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config" (OuterVolumeSpecName: "config") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.093490 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.093773 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerStarted","Data":"0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9"} Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.105094 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.117955 4882 generic.go:334] "Generic (PLEG): container finished" podID="ced43e79-8931-43f2-8984-8d4db604759e" containerID="7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669" exitCode=0 Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.118247 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" event={"ID":"ced43e79-8931-43f2-8984-8d4db604759e","Type":"ContainerDied","Data":"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669"} Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.119672 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" event={"ID":"ced43e79-8931-43f2-8984-8d4db604759e","Type":"ContainerDied","Data":"723b1327484818910db7b4a5439c15fea130428af589d6f10d9651dc4f0594a1"} Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.119795 4882 scope.go:117] "RemoveContainer" containerID="7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.120068 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dfd5bcfb5-s5hvm" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.129009 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.130440 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.145733 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.158067 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerStarted","Data":"79e0e8af47654b155118835e5f21fa8b7d9472519d2ff181b14a5a5b0039a9e1"} Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.200646 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.237054 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.239199 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.282301 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.297146 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ced43e79-8931-43f2-8984-8d4db604759e" (UID: "ced43e79-8931-43f2-8984-8d4db604759e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.335075 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7994f6475f-bw8mv" podStartSLOduration=4.3350549560000005 podStartE2EDuration="4.335054956s" podCreationTimestamp="2025-10-02 16:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:23.211739038 +0000 UTC m=+1261.960968565" watchObservedRunningTime="2025-10-02 16:38:23.335054956 +0000 UTC m=+1262.084284483" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.343359 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ced43e79-8931-43f2-8984-8d4db604759e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.348034 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.404688 4882 scope.go:117] "RemoveContainer" containerID="92723c67f38f1e15c649e3eb09dfc180b433670f50274eca6f8c0254916fe00f" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.511421 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.523778 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.538040 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dfd5bcfb5-s5hvm"] Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.568953 4882 scope.go:117] "RemoveContainer" containerID="17f6bab85dd9a04460dcce08857b45a763e3ab3a8e9824e170ce4c5f4540eb3a" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.680423 4882 scope.go:117] "RemoveContainer" containerID="a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.717710 4882 scope.go:117] "RemoveContainer" containerID="53e0cbb11866f92ebc74582d88c396361f95a738327cc8ea1b33dff834c5b3a0" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.765504 4882 scope.go:117] "RemoveContainer" containerID="7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669" Oct 02 16:38:23 crc kubenswrapper[4882]: E1002 16:38:23.765887 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669\": container with ID starting with 7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669 not found: ID does not exist" containerID="7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.765918 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669"} err="failed to get container status \"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669\": rpc error: code = NotFound desc = could not find container \"7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669\": container with ID starting with 7059ab8897bc39e57a39ae954cbf71a66fde86ec8b81d168fa956f796532e669 not found: ID does not exist" Oct 02 16:38:23 crc 
kubenswrapper[4882]: I1002 16:38:23.765954 4882 scope.go:117] "RemoveContainer" containerID="a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716" Oct 02 16:38:23 crc kubenswrapper[4882]: E1002 16:38:23.766902 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716\": container with ID starting with a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716 not found: ID does not exist" containerID="a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716" Oct 02 16:38:23 crc kubenswrapper[4882]: I1002 16:38:23.766951 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716"} err="failed to get container status \"a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716\": rpc error: code = NotFound desc = could not find container \"a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716\": container with ID starting with a792da182c4a9120a6e1f8546dd1b1e7d094d3509248a758226c030b08d61716 not found: ID does not exist" Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.188587 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerStarted","Data":"994e1db5b6ed85179232e2d376a1b3cac69d434ee1a4c628383422c9e26e22cf"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.193911 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerStarted","Data":"0e92b1baf65adc1945adace925b27e61d8fd9ec5f85e128c321df60451542625"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.198910 4882 generic.go:334] "Generic (PLEG): container finished" podID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerID="f1d51776d3fb2ce27054a65d1e19cd19651fc285559fccdd2272d4ef37ee2f07" exitCode=0 Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.199049 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" event={"ID":"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3","Type":"ContainerDied","Data":"f1d51776d3fb2ce27054a65d1e19cd19651fc285559fccdd2272d4ef37ee2f07"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.199089 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" event={"ID":"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3","Type":"ContainerStarted","Data":"ebea4ec3ada93fda42e4ab51c78fb36211299b22c05befc6729a80e75b54c33a"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.214794 4882 generic.go:334] "Generic (PLEG): container finished" podID="910ee338-ad58-468b-a2b6-08121fe1cadd" containerID="6b88143fef829995acc1ff754aded8f7b238aad31f9e0d362d032fb7a0e672eb" exitCode=0 Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.214885 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" event={"ID":"910ee338-ad58-468b-a2b6-08121fe1cadd","Type":"ContainerDied","Data":"6b88143fef829995acc1ff754aded8f7b238aad31f9e0d362d032fb7a0e672eb"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.227224 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" 
event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerStarted","Data":"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.227274 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerStarted","Data":"d664c084d0bf488bfca4c7d68739efa53ed15f711e4096faebc108a1a80c4342"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.232229 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerStarted","Data":"9aae72ae40de443470dd80bbe9e2b9a8a2e2a18642c7c12acb12ae2ccac621d2"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.235347 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerStarted","Data":"847a6b5f37acf49bf72646e223241418883ec7a098e17e1719709fcb52776e86"} Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.776671 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced43e79-8931-43f2-8984-8d4db604759e" path="/var/lib/kubelet/pods/ced43e79-8931-43f2-8984-8d4db604759e/volumes" Oct 02 16:38:24 crc kubenswrapper[4882]: I1002 16:38:24.976965 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.247613 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerStarted","Data":"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477"} Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.247707 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.247756 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.251057 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" event={"ID":"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3","Type":"ContainerStarted","Data":"797e8966d319d2ccb0ca1e140cb7f470e7fa35927face6c1c5d2f62e9ad7ed72"} Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.252198 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.255485 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerStarted","Data":"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393"} Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.260561 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerStarted","Data":"2227ef492f6b2d13f710d384f3d98fdd696edeeda9ecb9009912a2d6d56143d2"} Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.277167 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75654db7dd-lcgbh" podStartSLOduration=4.277146218 podStartE2EDuration="4.277146218s" podCreationTimestamp="2025-10-02 16:38:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:25.271761833 +0000 UTC m=+1264.020991360" watchObservedRunningTime="2025-10-02 16:38:25.277146218 +0000 UTC m=+1264.026375745" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.303007 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" podStartSLOduration=4.302987902 podStartE2EDuration="4.302987902s" podCreationTimestamp="2025-10-02 16:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:25.297276398 +0000 UTC m=+1264.046505925" watchObservedRunningTime="2025-10-02 16:38:25.302987902 +0000 UTC m=+1264.052217429" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.450975 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.618340 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.618853 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.618894 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.618954 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.618987 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ssp\" (UniqueName: \"kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.619057 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0\") pod \"910ee338-ad58-468b-a2b6-08121fe1cadd\" (UID: \"910ee338-ad58-468b-a2b6-08121fe1cadd\") " Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.642908 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config" (OuterVolumeSpecName: "config") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.647522 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.647553 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp" (OuterVolumeSpecName: "kube-api-access-z5ssp") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "kube-api-access-z5ssp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.659721 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.665363 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.693139 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "910ee338-ad58-468b-a2b6-08121fe1cadd" (UID: "910ee338-ad58-468b-a2b6-08121fe1cadd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721508 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721560 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721574 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721587 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721598 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5ssp\" (UniqueName: \"kubernetes.io/projected/910ee338-ad58-468b-a2b6-08121fe1cadd-kube-api-access-z5ssp\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:25 crc kubenswrapper[4882]: I1002 16:38:25.721609 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/910ee338-ad58-468b-a2b6-08121fe1cadd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.283281 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" event={"ID":"910ee338-ad58-468b-a2b6-08121fe1cadd","Type":"ContainerDied","Data":"8abd3fc522404a71c144a23c141d1885c61123e243f048665125b0f2e0984375"} Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.283375 4882 scope.go:117] "RemoveContainer" containerID="6b88143fef829995acc1ff754aded8f7b238aad31f9e0d362d032fb7a0e672eb" Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.283329 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dcc985965-nv5s6" Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.379078 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.424125 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dcc985965-nv5s6"] Oct 02 16:38:26 crc kubenswrapper[4882]: I1002 16:38:26.777822 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910ee338-ad58-468b-a2b6-08121fe1cadd" path="/var/lib/kubelet/pods/910ee338-ad58-468b-a2b6-08121fe1cadd/volumes" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.297514 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerStarted","Data":"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.297890 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerStarted","Data":"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.306401 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerStarted","Data":"fa70877f037a2b551f127b58f743f62d2cb6287cbea5a9f461cd1bd0f952ec86"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.311728 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerStarted","Data":"d2bca7c8b96365280c73a190e909e3383d53ecb7ff05b6f44fb139bd7006a923"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.312728 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.316957 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerStarted","Data":"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.317008 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerStarted","Data":"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.328629 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" podStartSLOduration=3.809408641 podStartE2EDuration="6.328612127s" podCreationTimestamp="2025-10-02 16:38:21 +0000 UTC" firstStartedPulling="2025-10-02 16:38:23.532644563 +0000 UTC m=+1262.281874090" lastFinishedPulling="2025-10-02 16:38:26.051848049 +0000 UTC m=+1264.801077576" observedRunningTime="2025-10-02 16:38:27.325342653 +0000 UTC m=+1266.074572190" watchObservedRunningTime="2025-10-02 16:38:27.328612127 +0000 UTC m=+1266.077841664" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.330999 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerStarted","Data":"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6"} Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.331270 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api" containerID="cri-o://aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" gracePeriod=30 Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.331641 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api-log" containerID="cri-o://82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" gracePeriod=30 Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.371999 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.65997427 podStartE2EDuration="7.371949563s" podCreationTimestamp="2025-10-02 16:38:20 +0000 UTC" firstStartedPulling="2025-10-02 16:38:22.009054344 +0000 UTC m=+1260.758283871" lastFinishedPulling="2025-10-02 16:38:23.721029637 +0000 UTC m=+1262.470259164" observedRunningTime="2025-10-02 16:38:27.357181939 +0000 UTC m=+1266.106411466" watchObservedRunningTime="2025-10-02 16:38:27.371949563 +0000 UTC m=+1266.121179100" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.400020 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8461667520000002 podStartE2EDuration="11.400000682s" podCreationTimestamp="2025-10-02 16:38:16 +0000 UTC" firstStartedPulling="2025-10-02 16:38:17.313664067 +0000 UTC m=+1256.062893594" lastFinishedPulling="2025-10-02 16:38:25.867497997 +0000 UTC m=+1264.616727524" observedRunningTime="2025-10-02 16:38:27.390116851 +0000 UTC m=+1266.139346378" watchObservedRunningTime="2025-10-02 16:38:27.400000682 +0000 UTC m=+1266.149230209" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.415237 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" podStartSLOduration=3.3676002560000002 podStartE2EDuration="6.415195985s" podCreationTimestamp="2025-10-02 16:38:21 +0000 UTC" firstStartedPulling="2025-10-02 16:38:22.986522772 +0000 UTC m=+1261.735752299" lastFinishedPulling="2025-10-02 16:38:26.034118501 +0000 UTC m=+1264.783348028" observedRunningTime="2025-10-02 16:38:27.413007691 +0000 UTC m=+1266.162237218" watchObservedRunningTime="2025-10-02 16:38:27.415195985 +0000 UTC m=+1266.164425512" Oct 02 16:38:27 crc kubenswrapper[4882]: I1002 16:38:27.438988 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.438967076 podStartE2EDuration="6.438967076s" podCreationTimestamp="2025-10-02 16:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:27.438534886 +0000 UTC m=+1266.187764413" watchObservedRunningTime="2025-10-02 16:38:27.438967076 +0000 UTC m=+1266.188196603" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.248897 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"] Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.250102 4882 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="dnsmasq-dns" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.250119 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="dnsmasq-dns" Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.250127 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="init" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.250148 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="init" Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.250170 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910ee338-ad58-468b-a2b6-08121fe1cadd" containerName="init" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.250177 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="910ee338-ad58-468b-a2b6-08121fe1cadd" containerName="init" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.250470 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced43e79-8931-43f2-8984-8d4db604759e" containerName="dnsmasq-dns" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.250492 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="910ee338-ad58-468b-a2b6-08121fe1cadd" containerName="init" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.251920 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.255454 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.255480 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.263877 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"] Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.283794 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353578 4882 generic.go:334] "Generic (PLEG): container finished" podID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerID="aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" exitCode=0 Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353614 4882 generic.go:334] "Generic (PLEG): container finished" podID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerID="82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" exitCode=143 Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353785 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerDied","Data":"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6"} Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353837 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerDied","Data":"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393"} Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353848 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2b9b628-f2ce-48a9-81f0-83591ae4ac84","Type":"ContainerDied","Data":"9aae72ae40de443470dd80bbe9e2b9a8a2e2a18642c7c12acb12ae2ccac621d2"} Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.353865 4882 scope.go:117] "RemoveContainer" containerID="aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.354723 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.384151 4882 scope.go:117] "RemoveContainer" containerID="82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.407406 4882 scope.go:117] "RemoveContainer" containerID="aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.407957 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6\": container with ID starting with aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6 not found: ID does not exist" containerID="aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408004 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6"} err="failed to get container status \"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6\": rpc error: code = NotFound desc = could not find container \"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6\": container with ID starting with aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6 not found: ID does not exist" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408033 4882 scope.go:117] "RemoveContainer" containerID="82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.408363 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393\": container with ID starting with 82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393 not found: ID does not exist" containerID="82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408389 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393"} err="failed to get container status \"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393\": rpc error: code = NotFound desc = could not find container \"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393\": container with ID starting with 82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393 not found: ID does not exist" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408404 4882 scope.go:117] "RemoveContainer" containerID="aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408659 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6"} err="failed to get container status \"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6\": rpc error: code = NotFound desc = could not find container \"aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6\": container with ID starting with aef039ddae3cf3ef116ea73847f2da5f5480ad9227c4d2e6c70388f982a82ba6 not found: ID does not exist" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408711 4882 scope.go:117] "RemoveContainer" containerID="82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.408991 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393"} err="failed to get container status \"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393\": rpc error: code = NotFound desc = could not find container \"82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393\": container with ID starting with 82f65111779e0de65b722b0fd9052bcdc37015623c8de19e1ff322fd9f732393 not found: ID does not exist" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433327 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433467 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhv7v\" (UniqueName: \"kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433640 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 
16:38:28.433685 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433754 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433868 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.433949 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id\") pod \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\" (UID: \"b2b9b628-f2ce-48a9-81f0-83591ae4ac84\") " Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434379 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434430 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434472 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434570 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434653 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434703 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rlp57\" (UniqueName: \"kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.434741 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.439610 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs" (OuterVolumeSpecName: "logs") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.439792 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.439896 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.440239 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts" (OuterVolumeSpecName: "scripts") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.457643 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v" (OuterVolumeSpecName: "kube-api-access-rhv7v") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "kube-api-access-rhv7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.466439 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.494548 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data" (OuterVolumeSpecName: "config-data") pod "b2b9b628-f2ce-48a9-81f0-83591ae4ac84" (UID: "b2b9b628-f2ce-48a9-81f0-83591ae4ac84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.536851 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537004 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537050 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlp57\" (UniqueName: \"kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537072 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537113 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537144 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537173 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537261 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537274 4882 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537286 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537297 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhv7v\" (UniqueName: \"kubernetes.io/projected/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-kube-api-access-rhv7v\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537311 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537320 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.537329 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b9b628-f2ce-48a9-81f0-83591ae4ac84-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.542548 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.545875 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.548838 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.549382 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.549870 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.552091 4882 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.560360 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlp57\" (UniqueName: \"kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57\") pod \"barbican-api-68897cb7f8-5wgv6\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.592037 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.722012 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.739757 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.803618 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" path="/var/lib/kubelet/pods/b2b9b628-f2ce-48a9-81f0-83591ae4ac84/volumes" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.804971 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.815723 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api-log" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.815768 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api-log" Oct 02 16:38:28 crc kubenswrapper[4882]: E1002 16:38:28.815803 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.815811 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.816136 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.816157 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b9b628-f2ce-48a9-81f0-83591ae4ac84" containerName="cinder-api-log" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.817528 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.817648 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.824071 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.824437 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.830918 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.949725 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950445 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950565 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kj4z\" (UniqueName: \"kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950594 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950621 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950651 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.950989 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.951232 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 
16:38:28 crc kubenswrapper[4882]: I1002 16:38:28.951447 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054238 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054360 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054409 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054465 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054515 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054658 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054712 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kj4z\" (UniqueName: \"kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054746 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.054772 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " 
pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.055423 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.055913 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.062260 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.063834 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.064379 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.064531 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.070534 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.071191 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.079196 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kj4z\" (UniqueName: \"kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z\") pod \"cinder-api-0\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.121735 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"] Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.159691 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.392846 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerStarted","Data":"81bedb9aeb702667b8a6d51fc9ff288fc91ae9eafee405c8744188c4fd827eed"} Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.393201 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerStarted","Data":"c1531f1c4ec303ed31f6714cd270c9a3cd43e44dcabbf282527f44b222d6cbb3"} Oct 02 16:38:29 crc kubenswrapper[4882]: I1002 16:38:29.848006 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:38:29 crc kubenswrapper[4882]: W1002 16:38:29.862829 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a33ca09_ff99_44fd_a978_ef69315caf26.slice/crio-be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59 WatchSource:0}: Error finding container be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59: Status 404 returned error can't find the container with id be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59 Oct 02 16:38:30 crc kubenswrapper[4882]: I1002 16:38:30.422990 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerStarted","Data":"be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59"} Oct 02 16:38:30 crc kubenswrapper[4882]: I1002 16:38:30.433411 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerStarted","Data":"42bd27be6298749bedf886c1fdcb856b9c7bdde2ff094fd0bbcab807ca38fc43"} Oct 02 16:38:30 crc kubenswrapper[4882]: I1002 16:38:30.433796 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:30 crc kubenswrapper[4882]: I1002 16:38:30.433818 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:30 crc kubenswrapper[4882]: I1002 16:38:30.456987 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68897cb7f8-5wgv6" podStartSLOduration=2.456966027 podStartE2EDuration="2.456966027s" podCreationTimestamp="2025-10-02 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:30.455319775 +0000 UTC m=+1269.204549302" watchObservedRunningTime="2025-10-02 16:38:30.456966027 +0000 UTC m=+1269.206195554" Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.287500 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.451124 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerStarted","Data":"93e77877e930a932f31f96d8c2b7d4e11b5ab6f70e5fa57cb3caad68f215b647"} Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.451680 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerStarted","Data":"d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156"} Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.510867 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.510845258 podStartE2EDuration="3.510845258s" podCreationTimestamp="2025-10-02 16:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:31.488241995 +0000 UTC m=+1270.237471542" watchObservedRunningTime="2025-10-02 16:38:31.510845258 +0000 UTC m=+1270.260074785" Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.568383 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 16:38:31 crc kubenswrapper[4882]: I1002 16:38:31.611396 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.322461 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.401607 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.402114 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="dnsmasq-dns" containerID="cri-o://7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9" gracePeriod=10 Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.464581 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="cinder-scheduler" containerID="cri-o://2227ef492f6b2d13f710d384f3d98fdd696edeeda9ecb9009912a2d6d56143d2" gracePeriod=30 Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.464979 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.465037 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="probe" containerID="cri-o://fa70877f037a2b551f127b58f743f62d2cb6287cbea5a9f461cd1bd0f952ec86" gracePeriod=30 Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.571239 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Oct 02 16:38:32 crc kubenswrapper[4882]: I1002 16:38:32.900584 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.025930 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.181790 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.181860 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.182164 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.182192 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.182301 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.182366 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jxp\" (UniqueName: \"kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp\") pod \"806e2852-f670-4614-9838-cabc0f4ebb79\" (UID: \"806e2852-f670-4614-9838-cabc0f4ebb79\") " Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.199691 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp" (OuterVolumeSpecName: "kube-api-access-q2jxp") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "kube-api-access-q2jxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.286171 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2jxp\" (UniqueName: \"kubernetes.io/projected/806e2852-f670-4614-9838-cabc0f4ebb79-kube-api-access-q2jxp\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.298820 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.309738 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.314851 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.338793 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config" (OuterVolumeSpecName: "config") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.339092 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "806e2852-f670-4614-9838-cabc0f4ebb79" (UID: "806e2852-f670-4614-9838-cabc0f4ebb79"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.343594 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.356803 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-568599c566-7t7js" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.388025 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.388532 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.388633 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.388705 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.388775 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/806e2852-f670-4614-9838-cabc0f4ebb79-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 
16:38:33.492031 4882 generic.go:334] "Generic (PLEG): container finished" podID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerID="fa70877f037a2b551f127b58f743f62d2cb6287cbea5a9f461cd1bd0f952ec86" exitCode=0 Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.492149 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerDied","Data":"fa70877f037a2b551f127b58f743f62d2cb6287cbea5a9f461cd1bd0f952ec86"} Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.518521 4882 generic.go:334] "Generic (PLEG): container finished" podID="806e2852-f670-4614-9838-cabc0f4ebb79" containerID="7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9" exitCode=0 Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.519410 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.519729 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" event={"ID":"806e2852-f670-4614-9838-cabc0f4ebb79","Type":"ContainerDied","Data":"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9"} Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.519766 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c9d7754df-vk4cc" event={"ID":"806e2852-f670-4614-9838-cabc0f4ebb79","Type":"ContainerDied","Data":"4809120486a155d6cc7e141710fb0c02a747072924c746183dea5b9eed03cd2c"} Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.519788 4882 scope.go:117] "RemoveContainer" containerID="7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.602665 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.605922 4882 scope.go:117] "RemoveContainer" containerID="75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.623168 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c9d7754df-vk4cc"] Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.654696 4882 scope.go:117] "RemoveContainer" containerID="7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9" Oct 02 16:38:33 crc kubenswrapper[4882]: E1002 16:38:33.656988 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9\": container with ID starting with 7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9 not found: ID does not exist" containerID="7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.657243 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9"} err="failed to get container status \"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9\": rpc error: code = NotFound desc = could not find container \"7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9\": container with ID starting with 7d96df1cd1431a735268c456fe298bf2b84cdce47f539d5557d257f76bb853d9 not found: ID does not exist" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 
16:38:33.657374 4882 scope.go:117] "RemoveContainer" containerID="75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921" Oct 02 16:38:33 crc kubenswrapper[4882]: E1002 16:38:33.658495 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921\": container with ID starting with 75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921 not found: ID does not exist" containerID="75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921" Oct 02 16:38:33 crc kubenswrapper[4882]: I1002 16:38:33.658666 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921"} err="failed to get container status \"75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921\": rpc error: code = NotFound desc = could not find container \"75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921\": container with ID starting with 75957578990ccc1bd909d000ddfed57d6ae7b9f79428cbc60566f34e20cd7921 not found: ID does not exist" Oct 02 16:38:34 crc kubenswrapper[4882]: I1002 16:38:34.581872 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:34 crc kubenswrapper[4882]: I1002 16:38:34.773034 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" path="/var/lib/kubelet/pods/806e2852-f670-4614-9838-cabc0f4ebb79/volumes" Oct 02 16:38:35 crc kubenswrapper[4882]: I1002 16:38:35.020830 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.786226 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:38:36 crc kubenswrapper[4882]: E1002 16:38:36.787017 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="init" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.787041 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="init" Oct 02 16:38:36 crc kubenswrapper[4882]: E1002 16:38:36.787054 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="dnsmasq-dns" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.787062 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="dnsmasq-dns" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.795373 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="806e2852-f670-4614-9838-cabc0f4ebb79" containerName="dnsmasq-dns" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.796874 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.802640 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.802796 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.805312 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.823527 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866444 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866501 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866525 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z22qw\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866550 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866592 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866615 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866646 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " 
pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.866698 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.967890 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.967968 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968003 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968021 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z22qw\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968042 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968080 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968103 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.968132 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " 
pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.969393 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.969452 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.980887 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.993064 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.995742 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.996450 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:36 crc kubenswrapper[4882]: I1002 16:38:36.998919 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.045127 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z22qw\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw\") pod \"swift-proxy-54f5c99697-qjljg\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") " pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.120344 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.576420 4882 generic.go:334] "Generic (PLEG): container finished" podID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerID="2227ef492f6b2d13f710d384f3d98fdd696edeeda9ecb9009912a2d6d56143d2" exitCode=0 Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.576675 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerDied","Data":"2227ef492f6b2d13f710d384f3d98fdd696edeeda9ecb9009912a2d6d56143d2"} Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.633383 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685401 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685459 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685538 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddrzk\" (UniqueName: \"kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685597 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685616 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685892 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.685982 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data\") pod \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\" (UID: \"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e\") " Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.686468 4882 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.691508 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk" (OuterVolumeSpecName: "kube-api-access-ddrzk") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "kube-api-access-ddrzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.692411 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.730415 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts" (OuterVolumeSpecName: "scripts") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.781503 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.787336 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.787381 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.787395 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddrzk\" (UniqueName: \"kubernetes.io/projected/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-kube-api-access-ddrzk\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.787404 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.834157 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 16:38:37 crc kubenswrapper[4882]: E1002 16:38:37.834969 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="cinder-scheduler" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.835008 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="cinder-scheduler" Oct 02 16:38:37 crc kubenswrapper[4882]: E1002 16:38:37.835063 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="probe" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.835071 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="probe" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.835356 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="cinder-scheduler" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.835431 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" containerName="probe" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.836602 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.844811 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.844921 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.845022 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-54n7z" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.854188 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.886424 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data" (OuterVolumeSpecName: "config-data") pod "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" (UID: "4f9e67dc-b133-4a3f-8a9d-4f6f457a682e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.891665 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.891745 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.891820 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.891865 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.891964 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.905824 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.993432 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc 
kubenswrapper[4882]: I1002 16:38:37.993611 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.993638 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.993696 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:37 crc kubenswrapper[4882]: I1002 16:38:37.995369 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.000591 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.001878 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.017874 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg\") pod \"openstackclient\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " pod="openstack/openstackclient" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.210325 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.599309 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f9e67dc-b133-4a3f-8a9d-4f6f457a682e","Type":"ContainerDied","Data":"79e0e8af47654b155118835e5f21fa8b7d9472519d2ff181b14a5a5b0039a9e1"} Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.599695 4882 scope.go:117] "RemoveContainer" containerID="fa70877f037a2b551f127b58f743f62d2cb6287cbea5a9f461cd1bd0f952ec86" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.599894 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.606297 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerStarted","Data":"130c4a62c2b640eaef7a5ead25af6e7d9449b34010c056f857dac87df15f4c2a"} Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.606360 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerStarted","Data":"2c29c361246fc5eb606a4aa1b6de2f64776e3b080d6fc8470369063ff5f186c2"} Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.624961 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.708106 4882 scope.go:117] "RemoveContainer" containerID="2227ef492f6b2d13f710d384f3d98fdd696edeeda9ecb9009912a2d6d56143d2" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.716787 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.737364 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.746916 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.781746 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.797951 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.839149 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9e67dc-b133-4a3f-8a9d-4f6f457a682e" path="/var/lib/kubelet/pods/4f9e67dc-b133-4a3f-8a9d-4f6f457a682e/volumes" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.840077 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910609 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910708 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910793 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910814 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kpncj\" (UniqueName: \"kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910837 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:38 crc kubenswrapper[4882]: I1002 16:38:38.910917 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012547 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012649 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012726 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012770 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012790 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpncj\" (UniqueName: \"kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012809 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.012885 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " 
pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.018775 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.022183 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.025863 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.026830 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.041864 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpncj\" (UniqueName: \"kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj\") pod \"cinder-scheduler-0\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.138541 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.632050 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerStarted","Data":"df07f33c6b476688e48e7aac7112b9a252910e43159a82f95d8b350ed2eaa481"} Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.632393 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.632416 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.639135 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ed84812d-565f-4ffc-a886-8cbddb32db0e","Type":"ContainerStarted","Data":"cbae1c2a51fd70da5bdd23726460e892a8a5e6beed83ff803333c4e2e835baff"} Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.671462 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-54f5c99697-qjljg" podStartSLOduration=3.671433334 podStartE2EDuration="3.671433334s" podCreationTimestamp="2025-10-02 16:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:39.657458221 +0000 UTC m=+1278.406687758" watchObservedRunningTime="2025-10-02 16:38:39.671433334 +0000 UTC m=+1278.420662861" Oct 02 16:38:39 crc kubenswrapper[4882]: I1002 16:38:39.818613 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.458989 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.459892 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-central-agent" containerID="cri-o://e4a13efc451885f02f25e9cf364db2fec60e58e398a693cacb7fa46d56f790f1" gracePeriod=30 Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.460074 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="proxy-httpd" containerID="cri-o://d2bca7c8b96365280c73a190e909e3383d53ecb7ff05b6f44fb139bd7006a923" gracePeriod=30 Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.460128 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="sg-core" containerID="cri-o://994e1db5b6ed85179232e2d376a1b3cac69d434ee1a4c628383422c9e26e22cf" gracePeriod=30 Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.460162 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-notification-agent" containerID="cri-o://a60e6c9cdcc2714bfb5170a13236c8bfb097fd20cb13a05ab49aa870f71ada34" gracePeriod=30 Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.486770 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 16:38:40 crc kubenswrapper[4882]: I1002 16:38:40.680625 4882 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerStarted","Data":"a34587d375e63d5807bf75834c5061dd3c3db2e8d13809c6042eb23945306774"} Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.672720 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708361 4882 generic.go:334] "Generic (PLEG): container finished" podID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerID="d2bca7c8b96365280c73a190e909e3383d53ecb7ff05b6f44fb139bd7006a923" exitCode=0 Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708410 4882 generic.go:334] "Generic (PLEG): container finished" podID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerID="994e1db5b6ed85179232e2d376a1b3cac69d434ee1a4c628383422c9e26e22cf" exitCode=2 Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708422 4882 generic.go:334] "Generic (PLEG): container finished" podID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerID="e4a13efc451885f02f25e9cf364db2fec60e58e398a693cacb7fa46d56f790f1" exitCode=0 Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708489 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerDied","Data":"d2bca7c8b96365280c73a190e909e3383d53ecb7ff05b6f44fb139bd7006a923"} Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708528 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerDied","Data":"994e1db5b6ed85179232e2d376a1b3cac69d434ee1a4c628383422c9e26e22cf"} Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.708542 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerDied","Data":"e4a13efc451885f02f25e9cf364db2fec60e58e398a693cacb7fa46d56f790f1"} Oct 02 16:38:41 crc kubenswrapper[4882]: I1002 16:38:41.713711 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerStarted","Data":"8b9f83a977cbb31784f8c0ea343e36bdd7f6e161aef6bbb69620207d95bcb979"} Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.341040 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.445029 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.445342 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75654db7dd-lcgbh" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api-log" containerID="cri-o://ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689" gracePeriod=30 Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.445989 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75654db7dd-lcgbh" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api" containerID="cri-o://c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477" gracePeriod=30 Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.465993 4882 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-75654db7dd-lcgbh" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.770081 4882 generic.go:334] "Generic (PLEG): container finished" podID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerID="ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689" exitCode=143 Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.794854 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerDied","Data":"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689"} Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.808671 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerStarted","Data":"08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5"} Oct 02 16:38:42 crc kubenswrapper[4882]: I1002 16:38:42.854478 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.854455407 podStartE2EDuration="4.854455407s" podCreationTimestamp="2025-10-02 16:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:38:42.851742018 +0000 UTC m=+1281.600971545" watchObservedRunningTime="2025-10-02 16:38:42.854455407 +0000 UTC m=+1281.603684934" Oct 02 16:38:43 crc kubenswrapper[4882]: I1002 16:38:43.478208 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 16:38:44 crc kubenswrapper[4882]: I1002 16:38:44.139389 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 16:38:45 crc kubenswrapper[4882]: I1002 16:38:45.866848 4882 generic.go:334] "Generic (PLEG): container finished" podID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerID="a60e6c9cdcc2714bfb5170a13236c8bfb097fd20cb13a05ab49aa870f71ada34" exitCode=0 Oct 02 16:38:45 crc kubenswrapper[4882]: I1002 16:38:45.867384 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerDied","Data":"a60e6c9cdcc2714bfb5170a13236c8bfb097fd20cb13a05ab49aa870f71ada34"} Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.303034 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.407495 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414486 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414528 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414588 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414667 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr5vb\" (UniqueName: \"kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414766 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414793 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.414815 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd\") pod \"0264e02a-1dfc-4dc4-8285-bc40e270916d\" (UID: \"0264e02a-1dfc-4dc4-8285-bc40e270916d\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.415725 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.415937 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.425444 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts" (OuterVolumeSpecName: "scripts") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.429346 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb" (OuterVolumeSpecName: "kube-api-access-dr5vb") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "kube-api-access-dr5vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.455514 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.516596 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle\") pod \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.516717 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom\") pod \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.516762 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5dhn\" (UniqueName: \"kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn\") pod \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.516850 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs\") pod \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.516986 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data\") pod \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\" (UID: \"841fb0d0-b75f-4a4f-8418-85891aab0cf2\") " Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.517446 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.517468 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.517482 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr5vb\" (UniqueName: \"kubernetes.io/projected/0264e02a-1dfc-4dc4-8285-bc40e270916d-kube-api-access-dr5vb\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.517496 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.517508 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0264e02a-1dfc-4dc4-8285-bc40e270916d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.519380 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs" (OuterVolumeSpecName: "logs") pod "841fb0d0-b75f-4a4f-8418-85891aab0cf2" (UID: "841fb0d0-b75f-4a4f-8418-85891aab0cf2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.521559 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "841fb0d0-b75f-4a4f-8418-85891aab0cf2" (UID: "841fb0d0-b75f-4a4f-8418-85891aab0cf2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.538317 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.541451 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn" (OuterVolumeSpecName: "kube-api-access-t5dhn") pod "841fb0d0-b75f-4a4f-8418-85891aab0cf2" (UID: "841fb0d0-b75f-4a4f-8418-85891aab0cf2"). InnerVolumeSpecName "kube-api-access-t5dhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.557803 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "841fb0d0-b75f-4a4f-8418-85891aab0cf2" (UID: "841fb0d0-b75f-4a4f-8418-85891aab0cf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.566373 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data" (OuterVolumeSpecName: "config-data") pod "0264e02a-1dfc-4dc4-8285-bc40e270916d" (UID: "0264e02a-1dfc-4dc4-8285-bc40e270916d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.592362 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data" (OuterVolumeSpecName: "config-data") pod "841fb0d0-b75f-4a4f-8418-85891aab0cf2" (UID: "841fb0d0-b75f-4a4f-8418-85891aab0cf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621202 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621275 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5dhn\" (UniqueName: \"kubernetes.io/projected/841fb0d0-b75f-4a4f-8418-85891aab0cf2-kube-api-access-t5dhn\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621296 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621313 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/841fb0d0-b75f-4a4f-8418-85891aab0cf2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621327 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0264e02a-1dfc-4dc4-8285-bc40e270916d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621339 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.621349 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fb0d0-b75f-4a4f-8418-85891aab0cf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.884154 4882 generic.go:334] "Generic (PLEG): container finished" podID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerID="c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477" exitCode=0 Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.884286 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerDied","Data":"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477"} Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.884775 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75654db7dd-lcgbh" event={"ID":"841fb0d0-b75f-4a4f-8418-85891aab0cf2","Type":"ContainerDied","Data":"d664c084d0bf488bfca4c7d68739efa53ed15f711e4096faebc108a1a80c4342"} Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.884804 4882 scope.go:117] "RemoveContainer" containerID="c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.884396 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75654db7dd-lcgbh" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.894957 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0264e02a-1dfc-4dc4-8285-bc40e270916d","Type":"ContainerDied","Data":"ff9d994a17bd00c403f560fcb65cf12cebbefad3ac60fff4588dcaac5b747b03"} Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.895055 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.921281 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.929885 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75654db7dd-lcgbh"] Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.940553 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.952276 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961255 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961703 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-notification-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961717 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-notification-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961745 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-central-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961752 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-central-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961768 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961774 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api" Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961785 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="proxy-httpd" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961792 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="proxy-httpd" Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961807 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api-log" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.961813 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api-log" Oct 02 16:38:46 crc kubenswrapper[4882]: E1002 16:38:46.961833 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="sg-core" Oct 02 16:38:46 crc 
kubenswrapper[4882]: I1002 16:38:46.961838 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="sg-core" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962020 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-notification-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962041 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="ceilometer-central-agent" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962056 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="sg-core" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962068 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962079 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" containerName="proxy-httpd" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.962092 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" containerName="barbican-api-log" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.963763 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.966147 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.966396 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:38:46 crc kubenswrapper[4882]: I1002 16:38:46.970733 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045405 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045454 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045529 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045548 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 
16:38:47.045575 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045610 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729kw\" (UniqueName: \"kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.045638 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.126272 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.127153 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150394 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150453 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150493 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150540 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729kw\" (UniqueName: \"kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150581 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150642 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " 
pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.150670 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.151519 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.151599 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.157564 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.158188 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.163325 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.166572 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.176597 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729kw\" (UniqueName: \"kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw\") pod \"ceilometer-0\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.281859 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:38:47 crc kubenswrapper[4882]: I1002 16:38:47.315618 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fb74689f6-xjtxd" Oct 02 16:38:48 crc kubenswrapper[4882]: I1002 16:38:48.015798 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:38:48 crc kubenswrapper[4882]: I1002 16:38:48.770751 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0264e02a-1dfc-4dc4-8285-bc40e270916d" path="/var/lib/kubelet/pods/0264e02a-1dfc-4dc4-8285-bc40e270916d/volumes" Oct 02 16:38:48 crc kubenswrapper[4882]: I1002 16:38:48.772237 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841fb0d0-b75f-4a4f-8418-85891aab0cf2" path="/var/lib/kubelet/pods/841fb0d0-b75f-4a4f-8418-85891aab0cf2/volumes" Oct 02 16:38:49 crc kubenswrapper[4882]: I1002 16:38:49.366415 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.268525 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.330668 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"] Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.330913 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fb74689f6-xjtxd" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-api" containerID="cri-o://7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693" gracePeriod=30 Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.331329 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fb74689f6-xjtxd" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-httpd" containerID="cri-o://2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5" gracePeriod=30 Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.957061 4882 generic.go:334] "Generic (PLEG): container finished" podID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerID="2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5" exitCode=0 Oct 02 16:38:50 crc kubenswrapper[4882]: I1002 16:38:50.957139 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerDied","Data":"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5"} Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.575021 4882 scope.go:117] "RemoveContainer" containerID="ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.754077 4882 scope.go:117] "RemoveContainer" containerID="c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477" Oct 02 16:38:53 crc kubenswrapper[4882]: E1002 16:38:53.755641 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477\": container with ID starting with c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477 not found: ID does not exist" containerID="c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.755713 4882 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477"} err="failed to get container status \"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477\": rpc error: code = NotFound desc = could not find container \"c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477\": container with ID starting with c398bdff85d6f8184cac67c0b6576e3bac72083aef1be1b531dc12b61621e477 not found: ID does not exist" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.755754 4882 scope.go:117] "RemoveContainer" containerID="ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689" Oct 02 16:38:53 crc kubenswrapper[4882]: E1002 16:38:53.756343 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689\": container with ID starting with ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689 not found: ID does not exist" containerID="ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.756388 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689"} err="failed to get container status \"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689\": rpc error: code = NotFound desc = could not find container \"ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689\": container with ID starting with ffe59ee746e00368f394bf1054a0d429ddc3ba9521c9bbbf047d179acf3ef689 not found: ID does not exist" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.756432 4882 scope.go:117] "RemoveContainer" containerID="d2bca7c8b96365280c73a190e909e3383d53ecb7ff05b6f44fb139bd7006a923" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.897068 4882 scope.go:117] "RemoveContainer" containerID="994e1db5b6ed85179232e2d376a1b3cac69d434ee1a4c628383422c9e26e22cf" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.930579 4882 scope.go:117] "RemoveContainer" containerID="a60e6c9cdcc2714bfb5170a13236c8bfb097fd20cb13a05ab49aa870f71ada34" Oct 02 16:38:53 crc kubenswrapper[4882]: I1002 16:38:53.961786 4882 scope.go:117] "RemoveContainer" containerID="e4a13efc451885f02f25e9cf364db2fec60e58e398a693cacb7fa46d56f790f1" Oct 02 16:38:54 crc kubenswrapper[4882]: I1002 16:38:54.003686 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ed84812d-565f-4ffc-a886-8cbddb32db0e","Type":"ContainerStarted","Data":"3e8912cefdcd596bcc23c33fee09e9e99d8e66bf4762de8ad88089cdb917b1ac"} Oct 02 16:38:54 crc kubenswrapper[4882]: I1002 16:38:54.020156 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.877441036 podStartE2EDuration="17.020135028s" podCreationTimestamp="2025-10-02 16:38:37 +0000 UTC" firstStartedPulling="2025-10-02 16:38:38.614544577 +0000 UTC m=+1277.363774104" lastFinishedPulling="2025-10-02 16:38:53.757238569 +0000 UTC m=+1292.506468096" observedRunningTime="2025-10-02 16:38:54.018341352 +0000 UTC m=+1292.767570879" watchObservedRunningTime="2025-10-02 16:38:54.020135028 +0000 UTC m=+1292.769364565" Oct 02 16:38:54 crc kubenswrapper[4882]: I1002 16:38:54.074331 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 
16:38:54 crc kubenswrapper[4882]: W1002 16:38:54.079968 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734b4cae_4bbc_44de_9a7a_81934cffa9be.slice/crio-28eb87f3d7b40daab71d162e554df8ebd4702022e0cda46994af90788a5c65a6 WatchSource:0}: Error finding container 28eb87f3d7b40daab71d162e554df8ebd4702022e0cda46994af90788a5c65a6: Status 404 returned error can't find the container with id 28eb87f3d7b40daab71d162e554df8ebd4702022e0cda46994af90788a5c65a6 Oct 02 16:38:54 crc kubenswrapper[4882]: I1002 16:38:54.921780 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb74689f6-xjtxd" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.010652 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle\") pod \"a7ab9abe-db2d-4e60-be2f-89eef3908282\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.010727 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config\") pod \"a7ab9abe-db2d-4e60-be2f-89eef3908282\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.010786 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2qf\" (UniqueName: \"kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf\") pod \"a7ab9abe-db2d-4e60-be2f-89eef3908282\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.010814 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config\") pod \"a7ab9abe-db2d-4e60-be2f-89eef3908282\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.010881 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs\") pod \"a7ab9abe-db2d-4e60-be2f-89eef3908282\" (UID: \"a7ab9abe-db2d-4e60-be2f-89eef3908282\") " Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.024328 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerStarted","Data":"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887"} Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.024423 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerStarted","Data":"28eb87f3d7b40daab71d162e554df8ebd4702022e0cda46994af90788a5c65a6"} Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.025894 4882 generic.go:334] "Generic (PLEG): container finished" podID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerID="7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693" exitCode=0 Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.026856 4882 util.go:48] "No ready sandbox for pod can be found. 
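The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" exchanges a few entries above are harmless: the container had already been removed, the runtime answers gRPC NotFound, and the kubelet records it and moves on. A sketch of the same idempotent-delete pattern, assuming remove is your own wrapper around a CRI call; this is the general gRPC idiom, not the kubelet's actual code path.

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// deleteIfPresent treats gRPC NotFound as success, mirroring how the
// kubelet tolerates "container ... not found: ID does not exist" while
// cleaning up containers that are already gone.
func deleteIfPresent(remove func(id string) error, id string) error {
	err := remove(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil // already gone; deletion is idempotent
	}
	return fmt.Errorf("removing container %s: %w", id, err)
}

func main() {
	remove := func(id string) error { // hypothetical runtime call
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(deleteIfPresent(remove, "c398bdff85d6")) // <nil>
}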
Need to start a new one" pod="openstack/neutron-fb74689f6-xjtxd" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.027062 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerDied","Data":"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693"} Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.027185 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb74689f6-xjtxd" event={"ID":"a7ab9abe-db2d-4e60-be2f-89eef3908282","Type":"ContainerDied","Data":"eda32871d96b18dcb6bd67cc00dd51e48e9da6a42514e4454a22348ea045ae08"} Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.027268 4882 scope.go:117] "RemoveContainer" containerID="2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.033085 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf" (OuterVolumeSpecName: "kube-api-access-vr2qf") pod "a7ab9abe-db2d-4e60-be2f-89eef3908282" (UID: "a7ab9abe-db2d-4e60-be2f-89eef3908282"). InnerVolumeSpecName "kube-api-access-vr2qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.040728 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a7ab9abe-db2d-4e60-be2f-89eef3908282" (UID: "a7ab9abe-db2d-4e60-be2f-89eef3908282"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.068811 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7ab9abe-db2d-4e60-be2f-89eef3908282" (UID: "a7ab9abe-db2d-4e60-be2f-89eef3908282"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.070905 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config" (OuterVolumeSpecName: "config") pod "a7ab9abe-db2d-4e60-be2f-89eef3908282" (UID: "a7ab9abe-db2d-4e60-be2f-89eef3908282"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.084960 4882 scope.go:117] "RemoveContainer" containerID="7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.097188 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a7ab9abe-db2d-4e60-be2f-89eef3908282" (UID: "a7ab9abe-db2d-4e60-be2f-89eef3908282"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.110402 4882 scope.go:117] "RemoveContainer" containerID="2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5" Oct 02 16:38:55 crc kubenswrapper[4882]: E1002 16:38:55.111038 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5\": container with ID starting with 2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5 not found: ID does not exist" containerID="2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.111110 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5"} err="failed to get container status \"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5\": rpc error: code = NotFound desc = could not find container \"2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5\": container with ID starting with 2d35d3829bf28ff24500f710b8c03d7fa1d9886689553230924230fd7a2cb0b5 not found: ID does not exist" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.111141 4882 scope.go:117] "RemoveContainer" containerID="7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.113426 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.113460 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2qf\" (UniqueName: \"kubernetes.io/projected/a7ab9abe-db2d-4e60-be2f-89eef3908282-kube-api-access-vr2qf\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.113474 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.113505 4882 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.113518 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ab9abe-db2d-4e60-be2f-89eef3908282-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:38:55 crc kubenswrapper[4882]: E1002 16:38:55.114046 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693\": container with ID starting with 7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693 not found: ID does not exist" containerID="7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.114155 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693"} err="failed to get container status 
\"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693\": rpc error: code = NotFound desc = could not find container \"7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693\": container with ID starting with 7e149e4f76e08aea6116122f405952c9589ac0a91225e57c0899f5bd4ff81693 not found: ID does not exist" Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.375681 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"] Oct 02 16:38:55 crc kubenswrapper[4882]: I1002 16:38:55.386423 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fb74689f6-xjtxd"] Oct 02 16:38:56 crc kubenswrapper[4882]: I1002 16:38:56.053158 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerStarted","Data":"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87"} Oct 02 16:38:56 crc kubenswrapper[4882]: I1002 16:38:56.774707 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" path="/var/lib/kubelet/pods/a7ab9abe-db2d-4e60-be2f-89eef3908282/volumes" Oct 02 16:38:57 crc kubenswrapper[4882]: I1002 16:38:57.066051 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerStarted","Data":"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097"} Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.079025 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerStarted","Data":"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303"} Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.079760 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-central-agent" containerID="cri-o://b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887" gracePeriod=30 Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.079804 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="sg-core" containerID="cri-o://14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097" gracePeriod=30 Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.079818 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="proxy-httpd" containerID="cri-o://3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303" gracePeriod=30 Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.079849 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-notification-agent" containerID="cri-o://e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87" gracePeriod=30 Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.080173 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.442245 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.843338015 
podStartE2EDuration="12.442206593s" podCreationTimestamp="2025-10-02 16:38:46 +0000 UTC" firstStartedPulling="2025-10-02 16:38:54.085982823 +0000 UTC m=+1292.835212350" lastFinishedPulling="2025-10-02 16:38:57.684851401 +0000 UTC m=+1296.434080928" observedRunningTime="2025-10-02 16:38:58.112307331 +0000 UTC m=+1296.861536858" watchObservedRunningTime="2025-10-02 16:38:58.442206593 +0000 UTC m=+1297.191436120" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.446601 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4z7p2"] Oct 02 16:38:58 crc kubenswrapper[4882]: E1002 16:38:58.446981 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-httpd" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.446999 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-httpd" Oct 02 16:38:58 crc kubenswrapper[4882]: E1002 16:38:58.447025 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-api" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.447031 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-api" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.447232 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-httpd" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.447245 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ab9abe-db2d-4e60-be2f-89eef3908282" containerName="neutron-api" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.447823 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.483938 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4z7p2"] Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.577683 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtd5\" (UniqueName: \"kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5\") pod \"nova-api-db-create-4z7p2\" (UID: \"3c0e5b41-6a96-453c-a2e1-b69c89186d5b\") " pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.620622 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gv7hs"] Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.621923 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.634020 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gv7hs"] Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.679580 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtd5\" (UniqueName: \"kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5\") pod \"nova-api-db-create-4z7p2\" (UID: \"3c0e5b41-6a96-453c-a2e1-b69c89186d5b\") " pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.679870 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxnzc\" (UniqueName: \"kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc\") pod \"nova-cell0-db-create-gv7hs\" (UID: \"e6c754b5-9feb-467d-97e9-3ff17db97760\") " pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.702799 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtd5\" (UniqueName: \"kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5\") pod \"nova-api-db-create-4z7p2\" (UID: \"3c0e5b41-6a96-453c-a2e1-b69c89186d5b\") " pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.728184 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hqqxc"] Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.729639 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.742295 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hqqxc"] Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.768727 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.791538 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh55l\" (UniqueName: \"kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l\") pod \"nova-cell1-db-create-hqqxc\" (UID: \"ff35d0a9-776d-4b61-8714-c5632ef1f4a6\") " pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.791743 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxnzc\" (UniqueName: \"kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc\") pod \"nova-cell0-db-create-gv7hs\" (UID: \"e6c754b5-9feb-467d-97e9-3ff17db97760\") " pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.814687 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxnzc\" (UniqueName: \"kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc\") pod \"nova-cell0-db-create-gv7hs\" (UID: \"e6c754b5-9feb-467d-97e9-3ff17db97760\") " pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.896963 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh55l\" (UniqueName: \"kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l\") pod \"nova-cell1-db-create-hqqxc\" (UID: \"ff35d0a9-776d-4b61-8714-c5632ef1f4a6\") " pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.936255 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh55l\" (UniqueName: \"kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l\") pod \"nova-cell1-db-create-hqqxc\" (UID: \"ff35d0a9-776d-4b61-8714-c5632ef1f4a6\") " pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:38:58 crc kubenswrapper[4882]: I1002 16:38:58.943835 4882 util.go:30] "No sandbox for pod can be found. 
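Each of the nova-*-db-create jobs above mounts a single volume, a projected kube-api-access-* token. Inside the container that projection lands at the standard service-account path; a short sketch of reading it, assuming only the well-known in-cluster paths, nothing specific to these jobs.

package main

import (
	"fmt"
	"os"
)

// A kube-api-access-* projected volume is mounted at this well-known
// path and carries the bound service-account token, the namespace, and
// the cluster CA bundle.
const saDir = "/var/run/secrets/kubernetes.io/serviceaccount"

func main() {
	for _, name := range []string{"token", "namespace", "ca.crt"} {
		b, err := os.ReadFile(saDir + "/" + name)
		if err != nil {
			fmt.Println(name, "not available:", err) // e.g. when run outside a pod
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}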
Need to start a new one" pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093351 4882 generic.go:334] "Generic (PLEG): container finished" podID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerID="3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303" exitCode=0 Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093405 4882 generic.go:334] "Generic (PLEG): container finished" podID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerID="14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097" exitCode=2 Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093418 4882 generic.go:334] "Generic (PLEG): container finished" podID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerID="e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87" exitCode=0 Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093446 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerDied","Data":"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303"} Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093483 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerDied","Data":"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097"} Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.093496 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerDied","Data":"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87"} Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.209291 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.266718 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4z7p2"] Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.417602 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gv7hs"] Oct 02 16:38:59 crc kubenswrapper[4882]: I1002 16:38:59.670498 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hqqxc"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.008344 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.114590 4882 generic.go:334] "Generic (PLEG): container finished" podID="3c0e5b41-6a96-453c-a2e1-b69c89186d5b" containerID="35f5965b2379272baa64a6b7b2f7efd500aef2a6fe7e1de3b2042137525d5921" exitCode=0 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.114657 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4z7p2" event={"ID":"3c0e5b41-6a96-453c-a2e1-b69c89186d5b","Type":"ContainerDied","Data":"35f5965b2379272baa64a6b7b2f7efd500aef2a6fe7e1de3b2042137525d5921"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.114689 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4z7p2" event={"ID":"3c0e5b41-6a96-453c-a2e1-b69c89186d5b","Type":"ContainerStarted","Data":"d29e20f5d30cc30a3adc9bcd7e5f582f36483046915869f92db1fbc71e49b22a"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.117868 4882 generic.go:334] "Generic (PLEG): container finished" podID="e6c754b5-9feb-467d-97e9-3ff17db97760" containerID="3445a7bd3b9d7e8f1b81bd2cb1baba59e81f2eb716e5ad4d3e54e83b4ed8a5e2" exitCode=0 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.117949 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gv7hs" event={"ID":"e6c754b5-9feb-467d-97e9-3ff17db97760","Type":"ContainerDied","Data":"3445a7bd3b9d7e8f1b81bd2cb1baba59e81f2eb716e5ad4d3e54e83b4ed8a5e2"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.117975 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gv7hs" event={"ID":"e6c754b5-9feb-467d-97e9-3ff17db97760","Type":"ContainerStarted","Data":"19198acfa4dc5d08c0c749bc6a79c11ffe1dea6ac12a42f961f0cf8039ed5e85"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.119672 4882 generic.go:334] "Generic (PLEG): container finished" podID="ff35d0a9-776d-4b61-8714-c5632ef1f4a6" containerID="f8ee3cae1e7ab82fde39e0ad973e86431ad255814bd39da4664b826387400de6" exitCode=0 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.119738 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hqqxc" event={"ID":"ff35d0a9-776d-4b61-8714-c5632ef1f4a6","Type":"ContainerDied","Data":"f8ee3cae1e7ab82fde39e0ad973e86431ad255814bd39da4664b826387400de6"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.119770 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hqqxc" event={"ID":"ff35d0a9-776d-4b61-8714-c5632ef1f4a6","Type":"ContainerStarted","Data":"e5d918fcdadddfabf7d1db45c066b27f6f80524dac0fffa5c5d09e4a63794aa3"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.122147 4882 generic.go:334] "Generic (PLEG): container finished" podID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerID="b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887" exitCode=0 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.122193 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerDied","Data":"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.122227 4882 util.go:48] "No ready sandbox for pod can be found. 
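The "Generic (PLEG): container finished" entries above carry each container's exit code: 0 for proxy-httpd and the ceilometer agents stopping cleanly, 2 for sg-core reporting an error on shutdown. Codes above 128 conventionally mean death by signal (128+N), so the 143 that glance-log reports further down is SIGTERM from its graceful kill. A small decoder, assuming that common shell-style encoding.

package main

import (
	"fmt"
	"syscall"
)

// describe interprets a container exit code the way the PLEG entries
// report it: 0 success, 1..127 an application error, 128+N killed by
// signal N (143 = 128+15 = SIGTERM, 137 = 128+9 = SIGKILL).
func describe(code int) string {
	switch {
	case code == 0:
		return "exited cleanly"
	case code > 128 && code < 160:
		return fmt.Sprintf("killed by signal %d (%s)", code-128, syscall.Signal(code-128))
	default:
		return fmt.Sprintf("application error %d", code)
	}
}

func main() {
	for _, c := range []int{0, 2, 143} { // exit codes seen in this log
		fmt.Printf("%3d: %s\n", c, describe(c))
	}
}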
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.122244 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"734b4cae-4bbc-44de-9a7a-81934cffa9be","Type":"ContainerDied","Data":"28eb87f3d7b40daab71d162e554df8ebd4702022e0cda46994af90788a5c65a6"} Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.122263 4882 scope.go:117] "RemoveContainer" containerID="3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127506 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127556 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127693 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127749 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127766 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127787 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.127805 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729kw\" (UniqueName: \"kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw\") pod \"734b4cae-4bbc-44de-9a7a-81934cffa9be\" (UID: \"734b4cae-4bbc-44de-9a7a-81934cffa9be\") " Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.128849 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.129634 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.140616 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts" (OuterVolumeSpecName: "scripts") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.141531 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw" (OuterVolumeSpecName: "kube-api-access-729kw") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "kube-api-access-729kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.161401 4882 scope.go:117] "RemoveContainer" containerID="14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.169105 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.222479 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230734 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230813 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230825 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230836 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/734b4cae-4bbc-44de-9a7a-81934cffa9be-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230847 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729kw\" (UniqueName: \"kubernetes.io/projected/734b4cae-4bbc-44de-9a7a-81934cffa9be-kube-api-access-729kw\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.230861 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.237076 4882 scope.go:117] "RemoveContainer" containerID="e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.256522 4882 scope.go:117] "RemoveContainer" containerID="b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.272020 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data" (OuterVolumeSpecName: "config-data") pod "734b4cae-4bbc-44de-9a7a-81934cffa9be" (UID: "734b4cae-4bbc-44de-9a7a-81934cffa9be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.282831 4882 scope.go:117] "RemoveContainer" containerID="3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.283382 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303\": container with ID starting with 3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303 not found: ID does not exist" containerID="3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.283435 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303"} err="failed to get container status \"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303\": rpc error: code = NotFound desc = could not find container \"3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303\": container with ID starting with 3e49f6f11ad980abbc2af2bcdef7895d0064ece8510a3b9f0326af3cda3ef303 not found: ID does not exist" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.283469 4882 scope.go:117] "RemoveContainer" containerID="14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.283803 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097\": container with ID starting with 14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097 not found: ID does not exist" containerID="14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.283840 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097"} err="failed to get container status \"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097\": rpc error: code = NotFound desc = could not find container \"14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097\": container with ID starting with 14f3e96c4191b4b8544ff5c43904231c9b6bdf9a99c6a81f28fb79fa8f973097 not found: ID does not exist" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.283865 4882 scope.go:117] "RemoveContainer" containerID="e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.284109 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87\": container with ID starting with e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87 not found: ID does not exist" containerID="e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.284153 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87"} err="failed to get container status \"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87\": rpc error: code = NotFound desc = could not 
find container \"e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87\": container with ID starting with e360b995c37dee2e3b006f25a174814fb2ca160087c277f376361368eff5bb87 not found: ID does not exist" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.284189 4882 scope.go:117] "RemoveContainer" containerID="b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.284473 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887\": container with ID starting with b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887 not found: ID does not exist" containerID="b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.284503 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887"} err="failed to get container status \"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887\": rpc error: code = NotFound desc = could not find container \"b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887\": container with ID starting with b55e7cc324f9fd9ae9fb5f6ddf7ac97de612a2f48accbfe1e7248c6bbce3a887 not found: ID does not exist" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.333233 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/734b4cae-4bbc-44de-9a7a-81934cffa9be-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.475600 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.492790 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.511186 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.511752 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="sg-core" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.511774 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="sg-core" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.511795 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-notification-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.511803 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-notification-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.511820 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="proxy-httpd" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.511828 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="proxy-httpd" Oct 02 16:39:00 crc kubenswrapper[4882]: E1002 16:39:00.511849 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" 
containerName="ceilometer-central-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.511856 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-central-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.512071 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-central-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.512088 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="proxy-httpd" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.512107 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="sg-core" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.512124 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" containerName="ceilometer-notification-agent" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.514319 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.518619 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.519315 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.519349 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.579048 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.579362 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-log" containerID="cri-o://52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c" gracePeriod=30 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.579481 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-httpd" containerID="cri-o://155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9" gracePeriod=30 Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638471 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h948\" (UniqueName: \"kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638521 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638559 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638574 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638713 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.638942 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.639102 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.740807 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.740888 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.740962 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h948\" (UniqueName: \"kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.740984 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.741013 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.741028 4882 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.741048 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.742318 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.742609 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.745345 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.745802 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.746510 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.747457 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.763097 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h948\" (UniqueName: \"kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948\") pod \"ceilometer-0\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " pod="openstack/ceilometer-0" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.778719 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734b4cae-4bbc-44de-9a7a-81934cffa9be" path="/var/lib/kubelet/pods/734b4cae-4bbc-44de-9a7a-81934cffa9be/volumes" Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.780740 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:00 crc kubenswrapper[4882]: I1002 16:39:00.781442 4882 util.go:30] "No sandbox for pod can be found. 
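"Cleaned up orphaned pod volumes dir" above is the kubelet confirming that /var/lib/kubelet/pods/<uid>/volumes emptied out after the unmounts; until it does, the old UID stays pinned on the node. A sketch for checking that by hand on the node (it needs access to the kubelet's data directory; the path layout is exactly what these entries print).

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	uid := "734b4cae-4bbc-44de-9a7a-81934cffa9be" // pod UID from the log
	dir := filepath.Join("/var/lib/kubelet/pods", uid, "volumes")
	entries, err := os.ReadDir(dir)
	if os.IsNotExist(err) {
		fmt.Println("volumes dir already cleaned up:", dir)
		return
	}
	if err != nil {
		fmt.Println("cannot read", dir, ":", err)
		return
	}
	// Anything still listed here is what blocks the kubelet's
	// "Cleaned up orphaned pod volumes dir" step.
	for _, e := range entries {
		fmt.Println("leftover:", filepath.Join(dir, e.Name()))
	}
}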
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.137088 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerDied","Data":"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c"} Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.137027 4882 generic.go:334] "Generic (PLEG): container finished" podID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerID="52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c" exitCode=143 Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.262689 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.484948 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.559769 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdtd5\" (UniqueName: \"kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5\") pod \"3c0e5b41-6a96-453c-a2e1-b69c89186d5b\" (UID: \"3c0e5b41-6a96-453c-a2e1-b69c89186d5b\") " Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.569427 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5" (OuterVolumeSpecName: "kube-api-access-mdtd5") pod "3c0e5b41-6a96-453c-a2e1-b69c89186d5b" (UID: "3c0e5b41-6a96-453c-a2e1-b69c89186d5b"). InnerVolumeSpecName "kube-api-access-mdtd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.610166 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.619061 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.662170 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdtd5\" (UniqueName: \"kubernetes.io/projected/3c0e5b41-6a96-453c-a2e1-b69c89186d5b-kube-api-access-mdtd5\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.763491 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh55l\" (UniqueName: \"kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l\") pod \"ff35d0a9-776d-4b61-8714-c5632ef1f4a6\" (UID: \"ff35d0a9-776d-4b61-8714-c5632ef1f4a6\") " Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.763787 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxnzc\" (UniqueName: \"kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc\") pod \"e6c754b5-9feb-467d-97e9-3ff17db97760\" (UID: \"e6c754b5-9feb-467d-97e9-3ff17db97760\") " Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.770425 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l" (OuterVolumeSpecName: "kube-api-access-xh55l") pod "ff35d0a9-776d-4b61-8714-c5632ef1f4a6" (UID: "ff35d0a9-776d-4b61-8714-c5632ef1f4a6"). 
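The glance-default-external-api-0 teardown reads as a deadline check: both containers were sent the graceful kill at 16:39:00.579 with gracePeriod=30, glance-log exited 143 (SIGTERM) about half a second later, and glance-httpd finishes cleanly with exitCode=0 a few entries below, well inside the budget. A sketch of that bookkeeping with timestamps copied from the log; note the "died" times are when the PLEG observed the exit, not the exact moment of death.

package main

import (
	"fmt"
	"time"
)

func at(s string) time.Time {
	t, err := time.Parse("15:04:05.000000", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	grace := 30 * time.Second // gracePeriod=30 from the kill entries
	containers := []struct {
		name         string
		killed, died time.Time
	}{
		{"glance-log", at("16:39:00.579362"), at("16:39:01.137027")},   // exitCode=143
		{"glance-httpd", at("16:39:00.579481"), at("16:39:04.198048")}, // exitCode=0
	}
	for _, c := range containers {
		took := c.died.Sub(c.killed)
		fmt.Printf("%s stopped in %v (budget %v, within=%v)\n",
			c.name, took, grace, took < grace)
	}
}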
InnerVolumeSpecName "kube-api-access-xh55l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.770584 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc" (OuterVolumeSpecName: "kube-api-access-zxnzc") pod "e6c754b5-9feb-467d-97e9-3ff17db97760" (UID: "e6c754b5-9feb-467d-97e9-3ff17db97760"). InnerVolumeSpecName "kube-api-access-zxnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.866084 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxnzc\" (UniqueName: \"kubernetes.io/projected/e6c754b5-9feb-467d-97e9-3ff17db97760-kube-api-access-zxnzc\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:01 crc kubenswrapper[4882]: I1002 16:39:01.866351 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh55l\" (UniqueName: \"kubernetes.io/projected/ff35d0a9-776d-4b61-8714-c5632ef1f4a6-kube-api-access-xh55l\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.152418 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerStarted","Data":"cff8ee0a7b83cd6ae29582c717812b49e030689c4070ca52d979cf3f9978dd63"} Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.155700 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4z7p2" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.157425 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4z7p2" event={"ID":"3c0e5b41-6a96-453c-a2e1-b69c89186d5b","Type":"ContainerDied","Data":"d29e20f5d30cc30a3adc9bcd7e5f582f36483046915869f92db1fbc71e49b22a"} Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.157475 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29e20f5d30cc30a3adc9bcd7e5f582f36483046915869f92db1fbc71e49b22a" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.160719 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gv7hs" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.160958 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gv7hs" event={"ID":"e6c754b5-9feb-467d-97e9-3ff17db97760","Type":"ContainerDied","Data":"19198acfa4dc5d08c0c749bc6a79c11ffe1dea6ac12a42f961f0cf8039ed5e85"} Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.161107 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19198acfa4dc5d08c0c749bc6a79c11ffe1dea6ac12a42f961f0cf8039ed5e85" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.168810 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hqqxc" event={"ID":"ff35d0a9-776d-4b61-8714-c5632ef1f4a6","Type":"ContainerDied","Data":"e5d918fcdadddfabf7d1db45c066b27f6f80524dac0fffa5c5d09e4a63794aa3"} Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.168863 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d918fcdadddfabf7d1db45c066b27f6f80524dac0fffa5c5d09e4a63794aa3" Oct 02 16:39:02 crc kubenswrapper[4882]: I1002 16:39:02.168935 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hqqxc" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.197327 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.198048 4882 generic.go:334] "Generic (PLEG): container finished" podID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerID="155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9" exitCode=0 Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.198145 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerDied","Data":"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9"} Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.198195 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f6d6450-e645-4a76-b0cb-76567cf1307c","Type":"ContainerDied","Data":"f9a7ebb617c7f2f7b177a0be82c00ec77d2d5649a02b835ac9a283f976933827"} Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.198238 4882 scope.go:117] "RemoveContainer" containerID="155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.202861 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerStarted","Data":"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1"} Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.254627 4882 scope.go:117] "RemoveContainer" containerID="52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.298732 4882 scope.go:117] "RemoveContainer" containerID="155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9" Oct 02 16:39:04 crc kubenswrapper[4882]: E1002 16:39:04.306402 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9\": container with ID starting with 155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9 not found: ID does not exist" containerID="155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.306463 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9"} err="failed to get container status \"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9\": rpc error: code = NotFound desc = could not find container \"155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9\": container with ID starting with 155c00065bbd42d2725cf14329da8491860f248c49b81b9708675e88e99820f9 not found: ID does not exist" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.306494 4882 scope.go:117] "RemoveContainer" containerID="52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c" Oct 02 16:39:04 crc kubenswrapper[4882]: E1002 16:39:04.315415 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c\": container with ID starting with 
52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c not found: ID does not exist" containerID="52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.315483 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c"} err="failed to get container status \"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c\": rpc error: code = NotFound desc = could not find container \"52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c\": container with ID starting with 52c397acb2dab2a23de23655b3f6cf1094036ef8be921d87755f3d654b346a5c not found: ID does not exist" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.319823 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.319914 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.319956 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.319978 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.320051 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.320099 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.320133 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmrm6\" (UniqueName: \"kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.320155 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts\") pod \"9f6d6450-e645-4a76-b0cb-76567cf1307c\" (UID: \"9f6d6450-e645-4a76-b0cb-76567cf1307c\") " Oct 02 16:39:04 crc 
kubenswrapper[4882]: I1002 16:39:04.321754 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs" (OuterVolumeSpecName: "logs") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.322150 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.332367 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts" (OuterVolumeSpecName: "scripts") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.339498 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.346490 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6" (OuterVolumeSpecName: "kube-api-access-hmrm6") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "kube-api-access-hmrm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.386028 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422564 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422618 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422632 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmrm6\" (UniqueName: \"kubernetes.io/projected/9f6d6450-e645-4a76-b0cb-76567cf1307c-kube-api-access-hmrm6\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422645 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422657 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.422668 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f6d6450-e645-4a76-b0cb-76567cf1307c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.435302 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data" (OuterVolumeSpecName: "config-data") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.454983 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.475398 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f6d6450-e645-4a76-b0cb-76567cf1307c" (UID: "9f6d6450-e645-4a76-b0cb-76567cf1307c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.525118 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.525176 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6d6450-e645-4a76-b0cb-76567cf1307c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:04 crc kubenswrapper[4882]: I1002 16:39:04.525192 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.214280 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.219232 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerStarted","Data":"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153"} Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.219298 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerStarted","Data":"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671"} Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.239459 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.249440 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.249780 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-log" containerID="cri-o://aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5" gracePeriod=30 Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.249827 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-httpd" containerID="cri-o://3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd" gracePeriod=30 Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.267333 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.283586 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:05 crc kubenswrapper[4882]: E1002 16:39:05.284307 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0e5b41-6a96-453c-a2e1-b69c89186d5b" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.284408 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0e5b41-6a96-453c-a2e1-b69c89186d5b" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: E1002 16:39:05.284518 4882 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-httpd" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.284592 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-httpd" Oct 02 16:39:05 crc kubenswrapper[4882]: E1002 16:39:05.284697 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-log" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.284795 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-log" Oct 02 16:39:05 crc kubenswrapper[4882]: E1002 16:39:05.284873 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c754b5-9feb-467d-97e9-3ff17db97760" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.284948 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c754b5-9feb-467d-97e9-3ff17db97760" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: E1002 16:39:05.285038 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff35d0a9-776d-4b61-8714-c5632ef1f4a6" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285114 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff35d0a9-776d-4b61-8714-c5632ef1f4a6" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285577 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-log" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285655 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" containerName="glance-httpd" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285680 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c754b5-9feb-467d-97e9-3ff17db97760" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285700 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff35d0a9-776d-4b61-8714-c5632ef1f4a6" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.285733 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0e5b41-6a96-453c-a2e1-b69c89186d5b" containerName="mariadb-database-create" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.287632 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.295748 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.296585 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.311110 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.339726 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.339785 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340099 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340198 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340277 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340340 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340365 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv9h9\" (UniqueName: \"kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.340455 4882 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.441948 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442003 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442081 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442120 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442148 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442182 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442207 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv9h9\" (UniqueName: \"kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.442265 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.443286 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.443551 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.443756 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.446633 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.448010 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.448317 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.453001 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.463605 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv9h9\" (UniqueName: \"kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.481025 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " pod="openstack/glance-default-external-api-0" Oct 02 16:39:05 crc kubenswrapper[4882]: I1002 16:39:05.614282 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:39:06 crc kubenswrapper[4882]: I1002 16:39:06.141361 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:39:06 crc kubenswrapper[4882]: I1002 16:39:06.243110 4882 generic.go:334] "Generic (PLEG): container finished" podID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerID="aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5" exitCode=143 Oct 02 16:39:06 crc kubenswrapper[4882]: I1002 16:39:06.243166 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerDied","Data":"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5"} Oct 02 16:39:06 crc kubenswrapper[4882]: I1002 16:39:06.246240 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerStarted","Data":"60672cd4904b96bbaea11c03ffc6269842a669ec325af40afd317152dd8f413a"} Oct 02 16:39:06 crc kubenswrapper[4882]: I1002 16:39:06.783081 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6d6450-e645-4a76-b0cb-76567cf1307c" path="/var/lib/kubelet/pods/9f6d6450-e645-4a76-b0cb-76567cf1307c/volumes" Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.260591 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerStarted","Data":"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa"} Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.260867 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-central-agent" containerID="cri-o://f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1" gracePeriod=30 Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.261278 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.261626 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="proxy-httpd" containerID="cri-o://0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa" gracePeriod=30 Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.261699 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="sg-core" containerID="cri-o://48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153" gracePeriod=30 Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.261744 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-notification-agent" containerID="cri-o://069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671" gracePeriod=30 Oct 02 16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.267435 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerStarted","Data":"12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843"} Oct 02 
16:39:07 crc kubenswrapper[4882]: I1002 16:39:07.293713 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.473084586 podStartE2EDuration="7.293688471s" podCreationTimestamp="2025-10-02 16:39:00 +0000 UTC" firstStartedPulling="2025-10-02 16:39:01.275894782 +0000 UTC m=+1300.025124319" lastFinishedPulling="2025-10-02 16:39:06.096498677 +0000 UTC m=+1304.845728204" observedRunningTime="2025-10-02 16:39:07.284393587 +0000 UTC m=+1306.033623114" watchObservedRunningTime="2025-10-02 16:39:07.293688471 +0000 UTC m=+1306.042917998" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286079 4882 generic.go:334] "Generic (PLEG): container finished" podID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerID="0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa" exitCode=0 Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286417 4882 generic.go:334] "Generic (PLEG): container finished" podID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerID="48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153" exitCode=2 Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286428 4882 generic.go:334] "Generic (PLEG): container finished" podID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerID="069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671" exitCode=0 Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286257 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerDied","Data":"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa"} Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286496 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerDied","Data":"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153"} Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.286510 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerDied","Data":"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671"} Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.288900 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerStarted","Data":"e93df0f0131f30a72dba182ec0f2f33b1dc2c6158f2a976f2348f6a3c71cbaaf"} Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.328586 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.328564391 podStartE2EDuration="3.328564391s" podCreationTimestamp="2025-10-02 16:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:08.306520745 +0000 UTC m=+1307.055750272" watchObservedRunningTime="2025-10-02 16:39:08.328564391 +0000 UTC m=+1307.077793918" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.709014 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e3bb-account-create-dlpct"] Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.710936 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.726301 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.738355 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e3bb-account-create-dlpct"] Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.828904 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfs2x\" (UniqueName: \"kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x\") pod \"nova-api-e3bb-account-create-dlpct\" (UID: \"843b17cb-e35a-455c-94c4-7afe31780c91\") " pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.895499 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-50c5-account-create-25qqs"] Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.899381 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-50c5-account-create-25qqs" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.901368 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.907954 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-50c5-account-create-25qqs"] Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.926179 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.930047 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfs2x\" (UniqueName: \"kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x\") pod \"nova-api-e3bb-account-create-dlpct\" (UID: \"843b17cb-e35a-455c-94c4-7afe31780c91\") " pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:08 crc kubenswrapper[4882]: I1002 16:39:08.965574 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfs2x\" (UniqueName: \"kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x\") pod \"nova-api-e3bb-account-create-dlpct\" (UID: \"843b17cb-e35a-455c-94c4-7afe31780c91\") " pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:08.997697 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-56bd-account-create-bkfmj"] Oct 02 16:39:09 crc kubenswrapper[4882]: E1002 16:39:08.998114 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-httpd" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:08.998127 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-httpd" Oct 02 16:39:09 crc kubenswrapper[4882]: E1002 16:39:08.998161 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-log" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:08.998166 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-log" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:08.998354 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerName="glance-log"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:08.998909 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-56bd-account-create-bkfmj"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.000984 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.020277 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-56bd-account-create-bkfmj"]
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.030991 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031056 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031088 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tt48\" (UniqueName: \"kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031139 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031275 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031316 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031346 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031420 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run\") pod \"65607d96-f458-46ff-b0e9-b4a3cd818657\" (UID: \"65607d96-f458-46ff-b0e9-b4a3cd818657\") "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.031814 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5wm\" (UniqueName: \"kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm\") pod \"nova-cell0-50c5-account-create-25qqs\" (UID: \"738df5a4-f251-460f-9d47-51a4b20967ad\") " pod="openstack/nova-cell0-50c5-account-create-25qqs"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.034285 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.034399 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs" (OuterVolumeSpecName: "logs") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.042194 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts" (OuterVolumeSpecName: "scripts") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.042275 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48" (OuterVolumeSpecName: "kube-api-access-4tt48") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "kube-api-access-4tt48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.042592 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e3bb-account-create-dlpct"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.049422 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.088464 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.113897 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data" (OuterVolumeSpecName: "config-data") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.117727 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65607d96-f458-46ff-b0e9-b4a3cd818657" (UID: "65607d96-f458-46ff-b0e9-b4a3cd818657"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133473 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtb5\" (UniqueName: \"kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5\") pod \"nova-cell1-56bd-account-create-bkfmj\" (UID: \"c62e6f10-05a5-4e97-ae04-3d312702455d\") " pod="openstack/nova-cell1-56bd-account-create-bkfmj"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133557 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5wm\" (UniqueName: \"kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm\") pod \"nova-cell0-50c5-account-create-25qqs\" (UID: \"738df5a4-f251-460f-9d47-51a4b20967ad\") " pod="openstack/nova-cell0-50c5-account-create-25qqs"
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133619 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133630 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133640 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133650 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133659 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65607d96-f458-46ff-b0e9-b4a3cd818657-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133709 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133721 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/65607d96-f458-46ff-b0e9-b4a3cd818657-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.133730 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tt48\" (UniqueName: \"kubernetes.io/projected/65607d96-f458-46ff-b0e9-b4a3cd818657-kube-api-access-4tt48\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.153581 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5wm\" (UniqueName: \"kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm\") pod \"nova-cell0-50c5-account-create-25qqs\" (UID: \"738df5a4-f251-460f-9d47-51a4b20967ad\") " pod="openstack/nova-cell0-50c5-account-create-25qqs" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.154194 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.237520 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtb5\" (UniqueName: \"kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5\") pod \"nova-cell1-56bd-account-create-bkfmj\" (UID: \"c62e6f10-05a5-4e97-ae04-3d312702455d\") " pod="openstack/nova-cell1-56bd-account-create-bkfmj" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.238049 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.242807 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-50c5-account-create-25qqs" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.254434 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtb5\" (UniqueName: \"kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5\") pod \"nova-cell1-56bd-account-create-bkfmj\" (UID: \"c62e6f10-05a5-4e97-ae04-3d312702455d\") " pod="openstack/nova-cell1-56bd-account-create-bkfmj" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.301855 4882 generic.go:334] "Generic (PLEG): container finished" podID="65607d96-f458-46ff-b0e9-b4a3cd818657" containerID="3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd" exitCode=0 Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.301946 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.301952 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerDied","Data":"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd"} Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.302087 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65607d96-f458-46ff-b0e9-b4a3cd818657","Type":"ContainerDied","Data":"8295b8fdf13817801a83e0e3949a38d9ae4397f892a84ca55493360babdc997f"} Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.302108 4882 scope.go:117] "RemoveContainer" containerID="3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.330965 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-56bd-account-create-bkfmj" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.334428 4882 scope.go:117] "RemoveContainer" containerID="aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.345661 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.361994 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.374465 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.378203 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.380273 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.382358 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.382538 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443437 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443499 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25bt7\" (UniqueName: \"kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443583 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443604 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443627 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443674 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443713 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.443761 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.446501 4882 scope.go:117] "RemoveContainer" containerID="3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd" Oct 02 16:39:09 crc kubenswrapper[4882]: E1002 16:39:09.447778 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd\": container with ID starting with 3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd not found: ID does not exist" containerID="3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.447815 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd"} err="failed to get container status \"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd\": rpc error: code = NotFound desc = could not find container \"3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd\": container with ID starting with 3c9d4307f380b6ed74043ed9f2d2d1ce2fa100e61c632ce67bf2c1879267dcdd not found: ID does not exist" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.447843 4882 scope.go:117] "RemoveContainer" containerID="aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5" Oct 02 16:39:09 crc kubenswrapper[4882]: E1002 16:39:09.448269 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5\": container with ID starting with aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5 not found: ID does not exist" containerID="aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.448290 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5"} err="failed to get container status \"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5\": rpc error: code = NotFound desc = could not find container \"aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5\": container with ID starting with aea2b2d37246ad5fb221e86a1993a41d250eb17cb7955a2a0ccace76226b83e5 not found: ID does not exist" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.547663 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548011 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25bt7\" (UniqueName: \"kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc 
kubenswrapper[4882]: I1002 16:39:09.548086 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548111 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548133 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548145 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548314 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548351 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548414 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548518 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.548853 4882 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.553643 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.553808 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.556889 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e3bb-account-create-dlpct"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.559096 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.579755 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.603541 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25bt7\" (UniqueName: \"kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.613382 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.730479 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-50c5-account-create-25qqs"] Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.731190 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:09 crc kubenswrapper[4882]: I1002 16:39:09.868480 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-56bd-account-create-bkfmj"] Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.300446 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:39:10 crc kubenswrapper[4882]: W1002 16:39:10.300506 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8cf6351_a2e4_475f_a9f2_9006fee40049.slice/crio-992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc WatchSource:0}: Error finding container 992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc: Status 404 returned error can't find the container with id 992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.313832 4882 generic.go:334] "Generic (PLEG): container finished" podID="c62e6f10-05a5-4e97-ae04-3d312702455d" containerID="60dd8ce85cbf8ece34f51a408ff617b2287ea1992d2c01d977fc1479f2c461eb" exitCode=0 Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.313954 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56bd-account-create-bkfmj" event={"ID":"c62e6f10-05a5-4e97-ae04-3d312702455d","Type":"ContainerDied","Data":"60dd8ce85cbf8ece34f51a408ff617b2287ea1992d2c01d977fc1479f2c461eb"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.313991 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56bd-account-create-bkfmj" event={"ID":"c62e6f10-05a5-4e97-ae04-3d312702455d","Type":"ContainerStarted","Data":"76039b9f71dba210b14b6f712dd730de1ebac1ce9737feb57a860fca8ef0f811"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.319737 4882 generic.go:334] "Generic (PLEG): container finished" podID="843b17cb-e35a-455c-94c4-7afe31780c91" containerID="e79f0308f09fd876708535b015dec7df2adfc0310031d690c5a22486f2f96d11" exitCode=0 Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.319806 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e3bb-account-create-dlpct" event={"ID":"843b17cb-e35a-455c-94c4-7afe31780c91","Type":"ContainerDied","Data":"e79f0308f09fd876708535b015dec7df2adfc0310031d690c5a22486f2f96d11"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.319836 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e3bb-account-create-dlpct" event={"ID":"843b17cb-e35a-455c-94c4-7afe31780c91","Type":"ContainerStarted","Data":"aeb36a043f912db1537bdfaa0958b0391fb079733b536761b09b41fb144d347a"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.322592 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerStarted","Data":"992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.339557 4882 generic.go:334] "Generic (PLEG): container finished" podID="738df5a4-f251-460f-9d47-51a4b20967ad" containerID="2581f184636b48853f0d00e8ec8da4e2bfda90cb24a614d3b7f1a2a83bb228d5" exitCode=0 Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.339610 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-50c5-account-create-25qqs" 
event={"ID":"738df5a4-f251-460f-9d47-51a4b20967ad","Type":"ContainerDied","Data":"2581f184636b48853f0d00e8ec8da4e2bfda90cb24a614d3b7f1a2a83bb228d5"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.339638 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-50c5-account-create-25qqs" event={"ID":"738df5a4-f251-460f-9d47-51a4b20967ad","Type":"ContainerStarted","Data":"eac9c20e599578267ec3115f6699714135421b18bddd311f9faec513ffe5662f"} Oct 02 16:39:10 crc kubenswrapper[4882]: I1002 16:39:10.780489 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65607d96-f458-46ff-b0e9-b4a3cd818657" path="/var/lib/kubelet/pods/65607d96-f458-46ff-b0e9-b4a3cd818657/volumes" Oct 02 16:39:11 crc kubenswrapper[4882]: I1002 16:39:11.354878 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerStarted","Data":"2d5462c744d32c02349363789c3e1d2d942136eca10c547d25ef94b8fbf2a4c0"} Oct 02 16:39:11 crc kubenswrapper[4882]: I1002 16:39:11.934448 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:11 crc kubenswrapper[4882]: I1002 16:39:11.943079 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-50c5-account-create-25qqs" Oct 02 16:39:11 crc kubenswrapper[4882]: I1002 16:39:11.949984 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-56bd-account-create-bkfmj" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.003523 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qtb5\" (UniqueName: \"kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5\") pod \"c62e6f10-05a5-4e97-ae04-3d312702455d\" (UID: \"c62e6f10-05a5-4e97-ae04-3d312702455d\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.003623 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5wm\" (UniqueName: \"kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm\") pod \"738df5a4-f251-460f-9d47-51a4b20967ad\" (UID: \"738df5a4-f251-460f-9d47-51a4b20967ad\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.003800 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfs2x\" (UniqueName: \"kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x\") pod \"843b17cb-e35a-455c-94c4-7afe31780c91\" (UID: \"843b17cb-e35a-455c-94c4-7afe31780c91\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.011647 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x" (OuterVolumeSpecName: "kube-api-access-xfs2x") pod "843b17cb-e35a-455c-94c4-7afe31780c91" (UID: "843b17cb-e35a-455c-94c4-7afe31780c91"). InnerVolumeSpecName "kube-api-access-xfs2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.011795 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm" (OuterVolumeSpecName: "kube-api-access-tg5wm") pod "738df5a4-f251-460f-9d47-51a4b20967ad" (UID: "738df5a4-f251-460f-9d47-51a4b20967ad"). InnerVolumeSpecName "kube-api-access-tg5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.016014 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5" (OuterVolumeSpecName: "kube-api-access-6qtb5") pod "c62e6f10-05a5-4e97-ae04-3d312702455d" (UID: "c62e6f10-05a5-4e97-ae04-3d312702455d"). InnerVolumeSpecName "kube-api-access-6qtb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.105116 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qtb5\" (UniqueName: \"kubernetes.io/projected/c62e6f10-05a5-4e97-ae04-3d312702455d-kube-api-access-6qtb5\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.105161 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg5wm\" (UniqueName: \"kubernetes.io/projected/738df5a4-f251-460f-9d47-51a4b20967ad-kube-api-access-tg5wm\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.105172 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfs2x\" (UniqueName: \"kubernetes.io/projected/843b17cb-e35a-455c-94c4-7afe31780c91-kube-api-access-xfs2x\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.178160 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206228 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h948\" (UniqueName: \"kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206318 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206398 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206449 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206522 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206575 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206593 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml\") pod \"aaa038c8-281e-48cc-b855-a0eacdf9579e\" (UID: \"aaa038c8-281e-48cc-b855-a0eacdf9579e\") " Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206847 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206964 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.206996 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.209904 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948" (OuterVolumeSpecName: "kube-api-access-6h948") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "kube-api-access-6h948". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.209921 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts" (OuterVolumeSpecName: "scripts") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.235015 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.274149 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.303758 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data" (OuterVolumeSpecName: "config-data") pod "aaa038c8-281e-48cc-b855-a0eacdf9579e" (UID: "aaa038c8-281e-48cc-b855-a0eacdf9579e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308318 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h948\" (UniqueName: \"kubernetes.io/projected/aaa038c8-281e-48cc-b855-a0eacdf9579e-kube-api-access-6h948\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308440 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308505 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaa038c8-281e-48cc-b855-a0eacdf9579e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308568 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308626 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.308681 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaa038c8-281e-48cc-b855-a0eacdf9579e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.365379 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-50c5-account-create-25qqs" event={"ID":"738df5a4-f251-460f-9d47-51a4b20967ad","Type":"ContainerDied","Data":"eac9c20e599578267ec3115f6699714135421b18bddd311f9faec513ffe5662f"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.365418 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-50c5-account-create-25qqs" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.365443 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac9c20e599578267ec3115f6699714135421b18bddd311f9faec513ffe5662f" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.367888 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56bd-account-create-bkfmj" event={"ID":"c62e6f10-05a5-4e97-ae04-3d312702455d","Type":"ContainerDied","Data":"76039b9f71dba210b14b6f712dd730de1ebac1ce9737feb57a860fca8ef0f811"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.367931 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76039b9f71dba210b14b6f712dd730de1ebac1ce9737feb57a860fca8ef0f811" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.368438 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-56bd-account-create-bkfmj" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.369746 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e3bb-account-create-dlpct" event={"ID":"843b17cb-e35a-455c-94c4-7afe31780c91","Type":"ContainerDied","Data":"aeb36a043f912db1537bdfaa0958b0391fb079733b536761b09b41fb144d347a"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.369769 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeb36a043f912db1537bdfaa0958b0391fb079733b536761b09b41fb144d347a" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.369779 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e3bb-account-create-dlpct" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.371685 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerStarted","Data":"45bb1b9d3f14a231f70b2267ff101ad41616e2eb0158bfb01cf9db8c6d56d8ca"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.374166 4882 generic.go:334] "Generic (PLEG): container finished" podID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerID="f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1" exitCode=0 Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.374411 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerDied","Data":"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.374454 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaa038c8-281e-48cc-b855-a0eacdf9579e","Type":"ContainerDied","Data":"cff8ee0a7b83cd6ae29582c717812b49e030689c4070ca52d979cf3f9978dd63"} Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.374508 4882 scope.go:117] "RemoveContainer" containerID="0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.374937 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.402665 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.402637678 podStartE2EDuration="3.402637678s" podCreationTimestamp="2025-10-02 16:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:12.392477001 +0000 UTC m=+1311.141706528" watchObservedRunningTime="2025-10-02 16:39:12.402637678 +0000 UTC m=+1311.151867205" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.421071 4882 scope.go:117] "RemoveContainer" containerID="48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.430975 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.447872 4882 scope.go:117] "RemoveContainer" containerID="069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.448048 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.461608 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462283 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843b17cb-e35a-455c-94c4-7afe31780c91" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462325 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="843b17cb-e35a-455c-94c4-7afe31780c91" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462343 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="proxy-httpd" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462350 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="proxy-httpd" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462364 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-notification-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462372 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-notification-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462408 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738df5a4-f251-460f-9d47-51a4b20967ad" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462417 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="738df5a4-f251-460f-9d47-51a4b20967ad" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462433 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-central-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462441 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-central-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462490 4882 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="sg-core" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462499 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="sg-core" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.462519 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62e6f10-05a5-4e97-ae04-3d312702455d" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462527 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62e6f10-05a5-4e97-ae04-3d312702455d" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462751 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="sg-core" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462767 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="proxy-httpd" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462797 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62e6f10-05a5-4e97-ae04-3d312702455d" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462808 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-notification-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462817 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="843b17cb-e35a-455c-94c4-7afe31780c91" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462828 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="738df5a4-f251-460f-9d47-51a4b20967ad" containerName="mariadb-account-create" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.462845 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" containerName="ceilometer-central-agent" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.464805 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.474112 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.476117 4882 scope.go:117] "RemoveContainer" containerID="f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.476908 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.481656 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.514194 4882 scope.go:117] "RemoveContainer" containerID="0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.516691 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa\": container with ID starting with 0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa not found: ID does not exist" containerID="0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.516727 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa"} err="failed to get container status \"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa\": rpc error: code = NotFound desc = could not find container \"0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa\": container with ID starting with 0aec7da0324c875fb5a5b0be11a71a5ae51a861439401907f5cfd9a75f3b9afa not found: ID does not exist" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.516756 4882 scope.go:117] "RemoveContainer" containerID="48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.517290 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153\": container with ID starting with 48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153 not found: ID does not exist" containerID="48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.517344 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153"} err="failed to get container status \"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153\": rpc error: code = NotFound desc = could not find container \"48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153\": container with ID starting with 48fd0fdd6b905e8fa548f61c6677442d56a5b3ac32ae5aac4dff1b6181b17153 not found: ID does not exist" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.517380 4882 scope.go:117] "RemoveContainer" containerID="069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.517677 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671\": container with ID starting with 069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671 not found: ID does not exist" containerID="069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.517727 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671"} err="failed to get container status \"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671\": rpc error: code = NotFound desc = could not find container \"069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671\": container with ID starting with 069d56a262ea2bb84003ba78389fb93d9a8328faf1fe61e9e80eb664dc85d671 not found: ID does not exist" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.517744 4882 scope.go:117] "RemoveContainer" containerID="f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1" Oct 02 16:39:12 crc kubenswrapper[4882]: E1002 16:39:12.518277 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1\": container with ID starting with f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1 not found: ID does not exist" containerID="f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.518326 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1"} err="failed to get container status \"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1\": rpc error: code = NotFound desc = could not find container \"f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1\": container with ID starting with f59725373b2561876502436747230afc07ccf1f7fb0a735a88a47cb35e66b5e1 not found: ID does not exist" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615195 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615357 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615441 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615486 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " 
pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615532 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615553 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.615615 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72l72\" (UniqueName: \"kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716342 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716411 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716451 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716492 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716509 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716540 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72l72\" (UniqueName: \"kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716601 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data\") pod \"ceilometer-0\" 
(UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.716925 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.717938 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.722425 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.722932 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.723619 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.723940 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.742012 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72l72\" (UniqueName: \"kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72\") pod \"ceilometer-0\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " pod="openstack/ceilometer-0" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.776161 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa038c8-281e-48cc-b855-a0eacdf9579e" path="/var/lib/kubelet/pods/aaa038c8-281e-48cc-b855-a0eacdf9579e/volumes" Oct 02 16:39:12 crc kubenswrapper[4882]: I1002 16:39:12.798658 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:13 crc kubenswrapper[4882]: I1002 16:39:13.282040 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:13 crc kubenswrapper[4882]: W1002 16:39:13.293645 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda122317f_f660_4914_8e39_e800c266d5a7.slice/crio-b8ed88edcbc11c83a1cc18eeb139533b261a4bed5707a5d3fd34465c58ae7097 WatchSource:0}: Error finding container b8ed88edcbc11c83a1cc18eeb139533b261a4bed5707a5d3fd34465c58ae7097: Status 404 returned error can't find the container with id b8ed88edcbc11c83a1cc18eeb139533b261a4bed5707a5d3fd34465c58ae7097 Oct 02 16:39:13 crc kubenswrapper[4882]: I1002 16:39:13.385265 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerStarted","Data":"b8ed88edcbc11c83a1cc18eeb139533b261a4bed5707a5d3fd34465c58ae7097"} Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.163870 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lgmf6"] Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.166887 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.170206 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.171295 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dc9pp" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.186300 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.199574 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lgmf6"] Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.350718 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrmn\" (UniqueName: \"kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.350801 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.351140 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.351234 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.400159 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerStarted","Data":"ce8b1c55ef5df8b4b009b6a602dd32bd82bc8dde5ab83a49f49cb22de0d4304b"} Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.453335 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.453466 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrmn\" (UniqueName: \"kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.453546 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.453655 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.458451 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.461139 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.469334 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data\") pod \"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.469571 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrmn\" (UniqueName: \"kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn\") pod 
\"nova-cell0-conductor-db-sync-lgmf6\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:14 crc kubenswrapper[4882]: I1002 16:39:14.487007 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.006797 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lgmf6"] Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.411935 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" event={"ID":"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb","Type":"ContainerStarted","Data":"69c4b496c7c7cc58369d9aa6a071d6efd49b847acd376cd029f6d52ab3193926"} Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.415614 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerStarted","Data":"2875a27042dc7b2dae3ebdf95fb40961c4f6b618d3612781f994f5995dabc27f"} Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.616400 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.616922 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.730918 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 16:39:15 crc kubenswrapper[4882]: I1002 16:39:15.765510 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 16:39:16 crc kubenswrapper[4882]: I1002 16:39:16.425821 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 16:39:16 crc kubenswrapper[4882]: I1002 16:39:16.425887 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 16:39:18 crc kubenswrapper[4882]: I1002 16:39:18.454160 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerStarted","Data":"69a3c9434abe928ced99cf2528160177cc2402d0addfa4f2c70bf4896a5c5965"} Oct 02 16:39:18 crc kubenswrapper[4882]: I1002 16:39:18.541463 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 16:39:18 crc kubenswrapper[4882]: I1002 16:39:18.541565 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:39:18 crc kubenswrapper[4882]: I1002 16:39:18.542899 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 16:39:19 crc kubenswrapper[4882]: I1002 16:39:19.732268 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:19 crc kubenswrapper[4882]: I1002 16:39:19.734774 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:19 crc kubenswrapper[4882]: I1002 16:39:19.771946 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 02 16:39:19 crc kubenswrapper[4882]: I1002 16:39:19.789010 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.174297 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.485776 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerStarted","Data":"7f141bb92b7087d9e34253048ef6e12f831a99f8f80791331fa9531b9666d404"} Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.486232 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.486251 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.486603 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:39:20 crc kubenswrapper[4882]: I1002 16:39:20.512025 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.609848835 podStartE2EDuration="8.511999609s" podCreationTimestamp="2025-10-02 16:39:12 +0000 UTC" firstStartedPulling="2025-10-02 16:39:13.297437325 +0000 UTC m=+1312.046666862" lastFinishedPulling="2025-10-02 16:39:19.199588109 +0000 UTC m=+1317.948817636" observedRunningTime="2025-10-02 16:39:20.506688175 +0000 UTC m=+1319.255917712" watchObservedRunningTime="2025-10-02 16:39:20.511999609 +0000 UTC m=+1319.261229136" Oct 02 16:39:21 crc kubenswrapper[4882]: I1002 16:39:21.495598 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-central-agent" containerID="cri-o://ce8b1c55ef5df8b4b009b6a602dd32bd82bc8dde5ab83a49f49cb22de0d4304b" gracePeriod=30 Oct 02 16:39:21 crc kubenswrapper[4882]: I1002 16:39:21.496315 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-notification-agent" containerID="cri-o://2875a27042dc7b2dae3ebdf95fb40961c4f6b618d3612781f994f5995dabc27f" gracePeriod=30 Oct 02 16:39:21 crc kubenswrapper[4882]: I1002 16:39:21.496298 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="proxy-httpd" containerID="cri-o://7f141bb92b7087d9e34253048ef6e12f831a99f8f80791331fa9531b9666d404" gracePeriod=30 Oct 02 16:39:21 crc kubenswrapper[4882]: I1002 16:39:21.498476 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="sg-core" containerID="cri-o://69a3c9434abe928ced99cf2528160177cc2402d0addfa4f2c70bf4896a5c5965" gracePeriod=30 Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505450 4882 generic.go:334] "Generic (PLEG): container finished" podID="a122317f-f660-4914-8e39-e800c266d5a7" containerID="7f141bb92b7087d9e34253048ef6e12f831a99f8f80791331fa9531b9666d404" exitCode=0 Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505784 4882 generic.go:334] "Generic 
(PLEG): container finished" podID="a122317f-f660-4914-8e39-e800c266d5a7" containerID="69a3c9434abe928ced99cf2528160177cc2402d0addfa4f2c70bf4896a5c5965" exitCode=2 Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505795 4882 generic.go:334] "Generic (PLEG): container finished" podID="a122317f-f660-4914-8e39-e800c266d5a7" containerID="2875a27042dc7b2dae3ebdf95fb40961c4f6b618d3612781f994f5995dabc27f" exitCode=0 Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505802 4882 generic.go:334] "Generic (PLEG): container finished" podID="a122317f-f660-4914-8e39-e800c266d5a7" containerID="ce8b1c55ef5df8b4b009b6a602dd32bd82bc8dde5ab83a49f49cb22de0d4304b" exitCode=0 Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505826 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerDied","Data":"7f141bb92b7087d9e34253048ef6e12f831a99f8f80791331fa9531b9666d404"} Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505857 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerDied","Data":"69a3c9434abe928ced99cf2528160177cc2402d0addfa4f2c70bf4896a5c5965"} Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505868 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerDied","Data":"2875a27042dc7b2dae3ebdf95fb40961c4f6b618d3612781f994f5995dabc27f"} Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.505877 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerDied","Data":"ce8b1c55ef5df8b4b009b6a602dd32bd82bc8dde5ab83a49f49cb22de0d4304b"} Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.562732 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.562877 4882 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 16:39:22 crc kubenswrapper[4882]: I1002 16:39:22.599433 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.713738 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.856765 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.856861 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.856943 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72l72\" (UniqueName: \"kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.857025 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.857086 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.857111 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.857189 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd\") pod \"a122317f-f660-4914-8e39-e800c266d5a7\" (UID: \"a122317f-f660-4914-8e39-e800c266d5a7\") " Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.857876 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.858538 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.858605 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.862085 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts" (OuterVolumeSpecName: "scripts") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.862922 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72" (OuterVolumeSpecName: "kube-api-access-72l72") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "kube-api-access-72l72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.882200 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.933654 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.960383 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.960421 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.960432 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72l72\" (UniqueName: \"kubernetes.io/projected/a122317f-f660-4914-8e39-e800c266d5a7-kube-api-access-72l72\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.960444 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.960453 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a122317f-f660-4914-8e39-e800c266d5a7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:25 crc kubenswrapper[4882]: I1002 16:39:25.976398 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data" (OuterVolumeSpecName: "config-data") pod "a122317f-f660-4914-8e39-e800c266d5a7" (UID: "a122317f-f660-4914-8e39-e800c266d5a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.062122 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a122317f-f660-4914-8e39-e800c266d5a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.554784 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" event={"ID":"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb","Type":"ContainerStarted","Data":"78a144deaf1a06d4548c269702e27808700d9b0044a7206f272d417145ca6e45"} Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.561438 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a122317f-f660-4914-8e39-e800c266d5a7","Type":"ContainerDied","Data":"b8ed88edcbc11c83a1cc18eeb139533b261a4bed5707a5d3fd34465c58ae7097"} Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.561516 4882 scope.go:117] "RemoveContainer" containerID="7f141bb92b7087d9e34253048ef6e12f831a99f8f80791331fa9531b9666d404" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.561558 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.588394 4882 scope.go:117] "RemoveContainer" containerID="69a3c9434abe928ced99cf2528160177cc2402d0addfa4f2c70bf4896a5c5965" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.611377 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" podStartSLOduration=2.092651251 podStartE2EDuration="12.611348701s" podCreationTimestamp="2025-10-02 16:39:14 +0000 UTC" firstStartedPulling="2025-10-02 16:39:15.012432824 +0000 UTC m=+1313.761662361" lastFinishedPulling="2025-10-02 16:39:25.531130284 +0000 UTC m=+1324.280359811" observedRunningTime="2025-10-02 16:39:26.58366292 +0000 UTC m=+1325.332892457" watchObservedRunningTime="2025-10-02 16:39:26.611348701 +0000 UTC m=+1325.360578228" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.620789 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.624415 4882 scope.go:117] "RemoveContainer" containerID="2875a27042dc7b2dae3ebdf95fb40961c4f6b618d3612781f994f5995dabc27f" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.635834 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.654482 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:26 crc kubenswrapper[4882]: E1002 16:39:26.654895 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="sg-core" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.654913 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="sg-core" Oct 02 16:39:26 crc kubenswrapper[4882]: E1002 16:39:26.654930 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="proxy-httpd" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.654937 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="proxy-httpd" Oct 02 16:39:26 crc kubenswrapper[4882]: E1002 16:39:26.654946 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-notification-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.654952 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-notification-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: E1002 16:39:26.654973 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-central-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.654979 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-central-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.655154 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="sg-core" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.655175 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-notification-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 
16:39:26.655187 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="ceilometer-central-agent" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.655199 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="a122317f-f660-4914-8e39-e800c266d5a7" containerName="proxy-httpd" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.656909 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.660266 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.660576 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.680956 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.692452 4882 scope.go:117] "RemoveContainer" containerID="ce8b1c55ef5df8b4b009b6a602dd32bd82bc8dde5ab83a49f49cb22de0d4304b" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.774196 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a122317f-f660-4914-8e39-e800c266d5a7" path="/var/lib/kubelet/pods/a122317f-f660-4914-8e39-e800c266d5a7/volumes" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780023 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxvj\" (UniqueName: \"kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780070 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780101 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780227 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780260 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780287 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.780326 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881597 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxvj\" (UniqueName: \"kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881692 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881734 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881862 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881892 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.881938 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.882000 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.883652 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.884870 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.890375 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.890452 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.891105 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.892722 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.911060 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxvj\" (UniqueName: \"kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj\") pod \"ceilometer-0\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " pod="openstack/ceilometer-0" Oct 02 16:39:26 crc kubenswrapper[4882]: I1002 16:39:26.986287 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:39:27 crc kubenswrapper[4882]: I1002 16:39:27.451952 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:39:27 crc kubenswrapper[4882]: W1002 16:39:27.453425 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e18be3_327d_42e9_a81a_39f9573b0a82.slice/crio-e5f66b56dfc696e84ae074e4d4e7a992735234fc01fcd3ef66721fd5e7c0f9e5 WatchSource:0}: Error finding container e5f66b56dfc696e84ae074e4d4e7a992735234fc01fcd3ef66721fd5e7c0f9e5: Status 404 returned error can't find the container with id e5f66b56dfc696e84ae074e4d4e7a992735234fc01fcd3ef66721fd5e7c0f9e5 Oct 02 16:39:27 crc kubenswrapper[4882]: I1002 16:39:27.571955 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerStarted","Data":"e5f66b56dfc696e84ae074e4d4e7a992735234fc01fcd3ef66721fd5e7c0f9e5"} Oct 02 16:39:28 crc kubenswrapper[4882]: I1002 16:39:28.586644 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerStarted","Data":"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244"} Oct 02 16:39:29 crc kubenswrapper[4882]: I1002 16:39:29.599405 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerStarted","Data":"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50"} Oct 02 16:39:29 crc kubenswrapper[4882]: I1002 16:39:29.599963 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerStarted","Data":"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009"} Oct 02 16:39:31 crc kubenswrapper[4882]: I1002 16:39:31.637568 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerStarted","Data":"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b"} Oct 02 16:39:31 crc kubenswrapper[4882]: I1002 16:39:31.638403 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:39:31 crc kubenswrapper[4882]: I1002 16:39:31.662498 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.296055455 podStartE2EDuration="5.662471295s" podCreationTimestamp="2025-10-02 16:39:26 +0000 UTC" firstStartedPulling="2025-10-02 16:39:27.456738289 +0000 UTC m=+1326.205967816" lastFinishedPulling="2025-10-02 16:39:30.823154129 +0000 UTC m=+1329.572383656" observedRunningTime="2025-10-02 16:39:31.657845307 +0000 UTC m=+1330.407074824" watchObservedRunningTime="2025-10-02 16:39:31.662471295 +0000 UTC m=+1330.411700822" Oct 02 16:39:39 crc kubenswrapper[4882]: I1002 16:39:39.390149 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:39:39 crc kubenswrapper[4882]: I1002 16:39:39.390698 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" 
podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:39:39 crc kubenswrapper[4882]: I1002 16:39:39.717040 4882 generic.go:334] "Generic (PLEG): container finished" podID="222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" containerID="78a144deaf1a06d4548c269702e27808700d9b0044a7206f272d417145ca6e45" exitCode=0 Oct 02 16:39:39 crc kubenswrapper[4882]: I1002 16:39:39.717099 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" event={"ID":"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb","Type":"ContainerDied","Data":"78a144deaf1a06d4548c269702e27808700d9b0044a7206f272d417145ca6e45"} Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.124805 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.175099 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts\") pod \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.175186 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lrmn\" (UniqueName: \"kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn\") pod \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.175230 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle\") pod \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.175304 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data\") pod \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\" (UID: \"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb\") " Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.182410 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts" (OuterVolumeSpecName: "scripts") pod "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" (UID: "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.186458 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn" (OuterVolumeSpecName: "kube-api-access-5lrmn") pod "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" (UID: "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb"). InnerVolumeSpecName "kube-api-access-5lrmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.207820 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data" (OuterVolumeSpecName: "config-data") pod "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" (UID: "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.210474 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" (UID: "222d830a-b239-42ec-9ad1-2fb0dbe9d0fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.277252 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.277285 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lrmn\" (UniqueName: \"kubernetes.io/projected/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-kube-api-access-5lrmn\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.277299 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.277309 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.739600 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" event={"ID":"222d830a-b239-42ec-9ad1-2fb0dbe9d0fb","Type":"ContainerDied","Data":"69c4b496c7c7cc58369d9aa6a071d6efd49b847acd376cd029f6d52ab3193926"} Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.739653 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69c4b496c7c7cc58369d9aa6a071d6efd49b847acd376cd029f6d52ab3193926" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.739714 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lgmf6" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.855990 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 16:39:41 crc kubenswrapper[4882]: E1002 16:39:41.856522 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" containerName="nova-cell0-conductor-db-sync" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.856540 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" containerName="nova-cell0-conductor-db-sync" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.856776 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" containerName="nova-cell0-conductor-db-sync" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.857466 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.860618 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dc9pp" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.869017 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.870919 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.991897 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llfqf\" (UniqueName: \"kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.992316 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:41 crc kubenswrapper[4882]: I1002 16:39:41.992445 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.093928 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.094014 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llfqf\" (UniqueName: \"kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc 
kubenswrapper[4882]: I1002 16:39:42.094056 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.100251 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.101308 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.112895 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llfqf\" (UniqueName: \"kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf\") pod \"nova-cell0-conductor-0\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") " pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.186408 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.622728 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 16:39:42 crc kubenswrapper[4882]: I1002 16:39:42.749518 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b883970f-7b20-4f83-9b05-3b0469caf183","Type":"ContainerStarted","Data":"7cefb0afebab7524603ee415809f5c8a71b000947237e97f22da960d93f032b9"} Oct 02 16:39:43 crc kubenswrapper[4882]: I1002 16:39:43.759902 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b883970f-7b20-4f83-9b05-3b0469caf183","Type":"ContainerStarted","Data":"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"} Oct 02 16:39:43 crc kubenswrapper[4882]: I1002 16:39:43.761078 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:43 crc kubenswrapper[4882]: I1002 16:39:43.780706 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.780684032 podStartE2EDuration="2.780684032s" podCreationTimestamp="2025-10-02 16:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:43.779007379 +0000 UTC m=+1342.528236906" watchObservedRunningTime="2025-10-02 16:39:43.780684032 +0000 UTC m=+1342.529913569" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.218162 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.689032 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-w7rvw"] Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.690644 
4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.693260 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.709428 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.710540 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-w7rvw"] Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.805594 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.805663 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.805777 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqpd\" (UniqueName: \"kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.805825 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.907711 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqpd\" (UniqueName: \"kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.907785 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.908032 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.908059 4882 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.915963 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.920012 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.932063 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.933548 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.936644 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.946908 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.962385 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqpd\" (UniqueName: \"kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd\") pod \"nova-cell0-cell-mapping-w7rvw\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") " pod="openstack/nova-cell0-cell-mapping-w7rvw" Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.970950 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 16:39:47 crc kubenswrapper[4882]: I1002 16:39:47.972882 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.004652 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.009325 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.009632 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.009721 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfjmk\" (UniqueName: \"kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.025239 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-w7rvw"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.118694 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135403 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135477 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135523 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135560 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfjmk\" (UniqueName: \"kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135633 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135724 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.135771 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9xj2\" (UniqueName: \"kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.237109 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.237237 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9xj2\" (UniqueName: \"kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.237301 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.237323 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.248927 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.259124 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.277113 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfjmk\" (UniqueName: \"kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.280921 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.287673 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.305121 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.330929 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.347829 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9xj2\" (UniqueName: \"kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2\") pod \"nova-api-0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.364470 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.421296 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.423099 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.428670 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.482386 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.516280 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.518034 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.522857 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.532420 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.549920 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.549964 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4v7\" (UniqueName: \"kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.550016 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.550411 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.554294 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.567827 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"]
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.605510 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.651814 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.651867 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2knk\" (UniqueName: \"kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.651920 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.652008 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.652035 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4v7\" (UniqueName: \"kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.652085 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.652125 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.658701 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.664455 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.679402 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4v7\" (UniqueName: \"kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7\") pod \"nova-scheduler-0\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.754912 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.755023 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.755800 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.755891 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756000 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2knk\" (UniqueName: \"kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756098 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756188 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756304 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6ln\" (UniqueName: \"kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756357 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756371 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.756458 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.760754 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.776680 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.781701 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2knk\" (UniqueName: \"kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk\") pod \"nova-metadata-0\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.804335 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.848176 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.859124 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.859274 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km6ln\" (UniqueName: \"kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.862375 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.862551 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.862783 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.863095 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.863951 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.864002 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.866802 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.867858 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.871786 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.887514 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6ln\" (UniqueName: \"kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln\") pod \"dnsmasq-dns-fb949cd99-xmhnq\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:48 crc kubenswrapper[4882]: I1002 16:39:48.891274 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.025689 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-w7rvw"]
Oct 02 16:39:49 crc kubenswrapper[4882]: W1002 16:39:49.042577 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bf4f202_ceb4_4eb0_aa35_22313a7d95e3.slice/crio-3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50 WatchSource:0}: Error finding container 3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50: Status 404 returned error can't find the container with id 3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.072177 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.145595 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4gj46"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.148081 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.156356 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.157083 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.177781 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4gj46"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.221721 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.279531 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.279680 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5c2v\" (UniqueName: \"kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.279719 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.279752 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.381448 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5c2v\" (UniqueName: \"kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.381534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.381575 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.381619 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.387327 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.387345 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.388044 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.400249 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5c2v\" (UniqueName: \"kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v\") pod \"nova-cell1-conductor-db-sync-4gj46\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.445357 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.614909 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:39:49 crc kubenswrapper[4882]: W1002 16:39:49.620817 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87efe738_d03e_496d_bdb4_fada78382621.slice/crio-ac50e1efd595554d1b69767643f1fd473dd40b95529fe15ffb2a801cdf4258e0 WatchSource:0}: Error finding container ac50e1efd595554d1b69767643f1fd473dd40b95529fe15ffb2a801cdf4258e0: Status 404 returned error can't find the container with id ac50e1efd595554d1b69767643f1fd473dd40b95529fe15ffb2a801cdf4258e0
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.642453 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"]
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.642802 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4gj46"
Oct 02 16:39:49 crc kubenswrapper[4882]: W1002 16:39:49.664639 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac797c1e_927e_42d4_81f5_6cb74bd13e8f.slice/crio-31ae64c7f7c079bb905a1b98236bfaaf54dd5704ee9c21ffff896d3e2675c5c0 WatchSource:0}: Error finding container 31ae64c7f7c079bb905a1b98236bfaaf54dd5704ee9c21ffff896d3e2675c5c0: Status 404 returned error can't find the container with id 31ae64c7f7c079bb905a1b98236bfaaf54dd5704ee9c21ffff896d3e2675c5c0
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.824840 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab19f927-457c-4ddf-a46d-fff6f76d24a0","Type":"ContainerStarted","Data":"563380e48337606dc620ca37ba963d613d3426e733cede682e01401ecf01aa54"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.832006 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerStarted","Data":"ac50e1efd595554d1b69767643f1fd473dd40b95529fe15ffb2a801cdf4258e0"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.835692 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-w7rvw" event={"ID":"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3","Type":"ContainerStarted","Data":"cd9314b6deb15a5ba62c52e608381c59d8b1a515fc6f2ae8581e6de0e466c344"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.835749 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-w7rvw" event={"ID":"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3","Type":"ContainerStarted","Data":"3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.841592 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" event={"ID":"ac797c1e-927e-42d4-81f5-6cb74bd13e8f","Type":"ContainerStarted","Data":"31ae64c7f7c079bb905a1b98236bfaaf54dd5704ee9c21ffff896d3e2675c5c0"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.843453 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fae16249-4e2f-4325-9c4e-27062b7bb7a2","Type":"ContainerStarted","Data":"cd6c6efd062ee7d172dc24a82a83f1f512b901ff7c9f1a37c1b08a0a23ae7cce"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.847055 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerStarted","Data":"7352f5ebbd9c23c2baec3017943639327195de21d51ae82cb2a6255dd59290de"}
Oct 02 16:39:49 crc kubenswrapper[4882]: I1002 16:39:49.853357 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-w7rvw" podStartSLOduration=2.853295597 podStartE2EDuration="2.853295597s" podCreationTimestamp="2025-10-02 16:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:49.849149403 +0000 UTC m=+1348.598378950" watchObservedRunningTime="2025-10-02 16:39:49.853295597 +0000 UTC m=+1348.602525134"
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.252889 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4gj46"]
Oct 02 16:39:50 crc kubenswrapper[4882]: W1002 16:39:50.262623 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1acce7dc_c882_4dab_9e8e_09bc781e559a.slice/crio-e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6 WatchSource:0}: Error finding container e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6: Status 404 returned error can't find the container with id e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.871572 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4gj46" event={"ID":"1acce7dc-c882-4dab-9e8e-09bc781e559a","Type":"ContainerStarted","Data":"5d74ea1886bb363a04bd54bc89d361229a325d82664628a7f70cd69449c9b60a"}
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.871897 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4gj46" event={"ID":"1acce7dc-c882-4dab-9e8e-09bc781e559a","Type":"ContainerStarted","Data":"e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6"}
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.874734 4882 generic.go:334] "Generic (PLEG): container finished" podID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerID="1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33" exitCode=0
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.876005 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" event={"ID":"ac797c1e-927e-42d4-81f5-6cb74bd13e8f","Type":"ContainerDied","Data":"1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33"}
Oct 02 16:39:50 crc kubenswrapper[4882]: I1002 16:39:50.896511 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4gj46" podStartSLOduration=1.896485838 podStartE2EDuration="1.896485838s" podCreationTimestamp="2025-10-02 16:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:50.893511923 +0000 UTC m=+1349.642741450" watchObservedRunningTime="2025-10-02 16:39:50.896485838 +0000 UTC m=+1349.645715365"
Oct 02 16:39:51 crc kubenswrapper[4882]: I1002 16:39:51.980729 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:39:52 crc kubenswrapper[4882]: I1002 16:39:52.035591 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 16:39:53 crc kubenswrapper[4882]: I1002 16:39:53.921384 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fae16249-4e2f-4325-9c4e-27062b7bb7a2","Type":"ContainerStarted","Data":"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0"}
Oct 02 16:39:53 crc kubenswrapper[4882]: I1002 16:39:53.955016 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.23506245 podStartE2EDuration="5.954996661s" podCreationTimestamp="2025-10-02 16:39:48 +0000 UTC" firstStartedPulling="2025-10-02 16:39:49.453400084 +0000 UTC m=+1348.202629611" lastFinishedPulling="2025-10-02 16:39:53.173334295 +0000 UTC m=+1351.922563822" observedRunningTime="2025-10-02 16:39:53.943757297 +0000 UTC m=+1352.692986844" watchObservedRunningTime="2025-10-02 16:39:53.954996661 +0000 UTC m=+1352.704226188"
Oct 02 16:39:53 crc kubenswrapper[4882]: I1002 16:39:53.961539 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerStarted","Data":"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5"}
Oct 02 16:39:53 crc kubenswrapper[4882]: I1002 16:39:53.996185 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab19f927-457c-4ddf-a46d-fff6f76d24a0","Type":"ContainerStarted","Data":"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28"}
Oct 02 16:39:53 crc kubenswrapper[4882]: I1002 16:39:53.996365 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28" gracePeriod=30
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.025902 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-log" containerID="cri-o://de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0" gracePeriod=30
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.026209 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerStarted","Data":"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0"}
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.026286 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-metadata" containerID="cri-o://67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e" gracePeriod=30
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.060105 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.030346845 podStartE2EDuration="7.060082669s" podCreationTimestamp="2025-10-02 16:39:47 +0000 UTC" firstStartedPulling="2025-10-02 16:39:49.131844874 +0000 UTC m=+1347.881074401" lastFinishedPulling="2025-10-02 16:39:53.161580698 +0000 UTC m=+1351.910810225" observedRunningTime="2025-10-02 16:39:54.055499813 +0000 UTC m=+1352.804729340" watchObservedRunningTime="2025-10-02 16:39:54.060082669 +0000 UTC m=+1352.809312196"
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.061751 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" event={"ID":"ac797c1e-927e-42d4-81f5-6cb74bd13e8f","Type":"ContainerStarted","Data":"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473"}
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.062935 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.096128 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.573177422 podStartE2EDuration="6.09610402s" podCreationTimestamp="2025-10-02 16:39:48 +0000 UTC" firstStartedPulling="2025-10-02 16:39:49.639838689 +0000 UTC m=+1348.389068216" lastFinishedPulling="2025-10-02 16:39:53.162765297 +0000 UTC m=+1351.911994814" observedRunningTime="2025-10-02 16:39:54.07673992 +0000 UTC m=+1352.825969447" watchObservedRunningTime="2025-10-02 16:39:54.09610402 +0000 UTC m=+1352.845333547"
Oct 02 16:39:54 crc kubenswrapper[4882]: I1002 16:39:54.142785 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" podStartSLOduration=6.14276087 podStartE2EDuration="6.14276087s" podCreationTimestamp="2025-10-02 16:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:39:54.132162392 +0000 UTC m=+1352.881391919" watchObservedRunningTime="2025-10-02 16:39:54.14276087 +0000 UTC m=+1352.891990397"
Oct 02 16:39:55 crc kubenswrapper[4882]: I1002 16:39:55.072374 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerStarted","Data":"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68"}
Oct 02 16:39:55 crc kubenswrapper[4882]: I1002 16:39:55.076452 4882 generic.go:334] "Generic (PLEG): container finished" podID="87efe738-d03e-496d-bdb4-fada78382621" containerID="de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0" exitCode=143
Oct 02 16:39:55 crc kubenswrapper[4882]: I1002 16:39:55.076594 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerDied","Data":"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0"}
Oct 02 16:39:55 crc kubenswrapper[4882]: I1002 16:39:55.076648 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerStarted","Data":"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e"}
Oct 02 16:39:55 crc kubenswrapper[4882]: I1002 16:39:55.091608 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.115504246 podStartE2EDuration="8.091586894s" podCreationTimestamp="2025-10-02 16:39:47 +0000 UTC" firstStartedPulling="2025-10-02 16:39:49.186245278 +0000 UTC m=+1347.935474805" lastFinishedPulling="2025-10-02 16:39:53.162327926 +0000 UTC m=+1351.911557453" observedRunningTime="2025-10-02 16:39:55.089505632 +0000 UTC m=+1353.838735169" watchObservedRunningTime="2025-10-02 16:39:55.091586894 +0000 UTC m=+1353.840816421"
Oct 02 16:39:56 crc kubenswrapper[4882]: I1002 16:39:56.992051 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.105197 4882 generic.go:334] "Generic (PLEG): container finished" podID="2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" containerID="cd9314b6deb15a5ba62c52e608381c59d8b1a515fc6f2ae8581e6de0e466c344" exitCode=0
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.105244 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-w7rvw" event={"ID":"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3","Type":"ContainerDied","Data":"cd9314b6deb15a5ba62c52e608381c59d8b1a515fc6f2ae8581e6de0e466c344"}
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.365442 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.606258 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.606323 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.805086 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.805154 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.835750 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.849023 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.849089 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.893431 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq"
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.952680 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"]
Oct 02 16:39:58 crc kubenswrapper[4882]: I1002 16:39:58.953434 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="dnsmasq-dns" containerID="cri-o://797e8966d319d2ccb0ca1e140cb7f470e7fa35927face6c1c5d2f62e9ad7ed72" gracePeriod=10
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.120051 4882 generic.go:334] "Generic (PLEG): container finished" podID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerID="797e8966d319d2ccb0ca1e140cb7f470e7fa35927face6c1c5d2f62e9ad7ed72" exitCode=0
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.120279 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" event={"ID":"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3","Type":"ContainerDied","Data":"797e8966d319d2ccb0ca1e140cb7f470e7fa35927face6c1c5d2f62e9ad7ed72"}
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.178195 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.620940 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp"
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.621752 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-w7rvw"
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.688649 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.689303 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745313 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745765 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts\") pod \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745807 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745859 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqpd\" (UniqueName: \"kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd\") pod \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745885 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745912 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.745995 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle\") pod \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.746037 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data\") pod \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\" (UID: \"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.746085 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvpd2\" (UniqueName: \"kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.746126 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb\") pod \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\" (UID: \"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3\") "
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.751624 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts" (OuterVolumeSpecName: "scripts") pod "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" (UID: "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.755435 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd" (OuterVolumeSpecName: "kube-api-access-6lqpd") pod "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" (UID: "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3"). InnerVolumeSpecName "kube-api-access-6lqpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.755489 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2" (OuterVolumeSpecName: "kube-api-access-bvpd2") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "kube-api-access-bvpd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.783863 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data" (OuterVolumeSpecName: "config-data") pod "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" (UID: "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.811358 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.825010 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.826171 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" (UID: "2bf4f202-ceb4-4eb0-aa35-22313a7d95e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.842105 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.844988 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848317 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848363 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848379 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqpd\" (UniqueName: \"kubernetes.io/projected/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-kube-api-access-6lqpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848390 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848402 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848415 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848426 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvpd2\" (UniqueName: \"kubernetes.io/projected/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-kube-api-access-bvpd2\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848436 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.848447 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.869274 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config" (OuterVolumeSpecName: "config") pod "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" (UID: "ecbe44a6-e3be-4874-ac22-63f1e5fc34e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:39:59 crc kubenswrapper[4882]: I1002 16:39:59.950908 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3-config\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.132482 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp" event={"ID":"ecbe44a6-e3be-4874-ac22-63f1e5fc34e3","Type":"ContainerDied","Data":"ebea4ec3ada93fda42e4ab51c78fb36211299b22c05befc6729a80e75b54c33a"}
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.132558 4882 scope.go:117] "RemoveContainer" containerID="797e8966d319d2ccb0ca1e140cb7f470e7fa35927face6c1c5d2f62e9ad7ed72"
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.132737 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9668db97-bkpfp"
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.149058 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-w7rvw"
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.149123 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-w7rvw" event={"ID":"2bf4f202-ceb4-4eb0-aa35-22313a7d95e3","Type":"ContainerDied","Data":"3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50"}
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.149178 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3111b93902637949b6fa2f99995a75984a22cda0e9142701bca34b134ecaaa50"
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.185187 4882 scope.go:117] "RemoveContainer" containerID="f1d51776d3fb2ce27054a65d1e19cd19651fc285559fccdd2272d4ef37ee2f07"
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.199256 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"]
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.216936 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d9668db97-bkpfp"]
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.305001 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.305316 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-log" containerID="cri-o://b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5" gracePeriod=30
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.305372 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-api" containerID="cri-o://946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68" gracePeriod=30
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.325549 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 16:40:00 crc kubenswrapper[4882]: I1002 16:40:00.772441 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" path="/var/lib/kubelet/pods/ecbe44a6-e3be-4874-ac22-63f1e5fc34e3/volumes"
Oct 02 16:40:01 crc kubenswrapper[4882]: I1002 16:40:01.157576 4882 generic.go:334] "Generic (PLEG): container finished" podID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerID="b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5" exitCode=143
Oct 02 16:40:01 crc kubenswrapper[4882]: I1002 16:40:01.157994 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerDied","Data":"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5"}
Oct 02 16:40:01 crc kubenswrapper[4882]: I1002 16:40:01.159079 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerName="nova-scheduler-scheduler" containerID="cri-o://0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" gracePeriod=30
Oct 02 16:40:01 crc kubenswrapper[4882]: I1002 16:40:01.726993 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 16:40:01 crc kubenswrapper[4882]: I1002 16:40:01.727594 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" containerName="kube-state-metrics" containerID="cri-o://fd3d1168a9052070845923b9e8dea6c765e24e4825cb3252fba286141fb23f1a" gracePeriod=30
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.170761 4882 generic.go:334] "Generic (PLEG): container finished" podID="1acce7dc-c882-4dab-9e8e-09bc781e559a" containerID="5d74ea1886bb363a04bd54bc89d361229a325d82664628a7f70cd69449c9b60a" exitCode=0
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.170822 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4gj46" event={"ID":"1acce7dc-c882-4dab-9e8e-09bc781e559a","Type":"ContainerDied","Data":"5d74ea1886bb363a04bd54bc89d361229a325d82664628a7f70cd69449c9b60a"}
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.173038 4882 generic.go:334] "Generic (PLEG): container finished" podID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" containerID="fd3d1168a9052070845923b9e8dea6c765e24e4825cb3252fba286141fb23f1a" exitCode=2
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.173083 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d019714-550c-4da5-9d79-8bd03c1cb2f6","Type":"ContainerDied","Data":"fd3d1168a9052070845923b9e8dea6c765e24e4825cb3252fba286141fb23f1a"}
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.173104 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5d019714-550c-4da5-9d79-8bd03c1cb2f6","Type":"ContainerDied","Data":"255016af16e080e283a42effac070fb40b9888758df49e3ee094f4efe07d1222"}
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.173117 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="255016af16e080e283a42effac070fb40b9888758df49e3ee094f4efe07d1222"
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.249154 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.401380 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlv5n\" (UniqueName: \"kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n\") pod \"5d019714-550c-4da5-9d79-8bd03c1cb2f6\" (UID: \"5d019714-550c-4da5-9d79-8bd03c1cb2f6\") "
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.407040 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n" (OuterVolumeSpecName: "kube-api-access-nlv5n") pod "5d019714-550c-4da5-9d79-8bd03c1cb2f6" (UID: "5d019714-550c-4da5-9d79-8bd03c1cb2f6"). InnerVolumeSpecName "kube-api-access-nlv5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:40:02 crc kubenswrapper[4882]: I1002 16:40:02.504280 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlv5n\" (UniqueName: \"kubernetes.io/projected/5d019714-550c-4da5-9d79-8bd03c1cb2f6-kube-api-access-nlv5n\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.181592 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.219284 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.228101 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238135 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.238627 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" containerName="kube-state-metrics"
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238649 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" containerName="kube-state-metrics"
Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.238665 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" containerName="nova-manage"
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238672 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" containerName="nova-manage"
Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.238688 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="dnsmasq-dns"
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238694 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="dnsmasq-dns"
Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.238719 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="init"
Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238725 4882 state_mem.go:107] "Deleted CPUSet assignment"
podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="init" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238927 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbe44a6-e3be-4874-ac22-63f1e5fc34e3" containerName="dnsmasq-dns" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238936 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" containerName="nova-manage" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.238949 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" containerName="kube-state-metrics" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.239666 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.242417 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.242587 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.253363 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.421821 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.421902 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.421930 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbl5n\" (UniqueName: \"kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.422001 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.523868 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.523952 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.523990 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbl5n\" (UniqueName: \"kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.524085 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.530391 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.530763 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.532719 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.546947 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbl5n\" (UniqueName: \"kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n\") pod \"kube-state-metrics-0\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.576094 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.633449 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.633763 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-central-agent" containerID="cri-o://8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244" gracePeriod=30 Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.634197 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="proxy-httpd" containerID="cri-o://03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b" gracePeriod=30 Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.634267 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="sg-core" containerID="cri-o://3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50" gracePeriod=30 Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.634330 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-notification-agent" containerID="cri-o://0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009" gracePeriod=30 Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.740453 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4gj46" Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.807122 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.808714 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.809996 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:40:03 crc kubenswrapper[4882]: E1002 16:40:03.810074 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerName="nova-scheduler-scheduler" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.941861 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data\") pod \"1acce7dc-c882-4dab-9e8e-09bc781e559a\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.941948 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5c2v\" (UniqueName: \"kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v\") pod \"1acce7dc-c882-4dab-9e8e-09bc781e559a\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.941980 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts\") pod \"1acce7dc-c882-4dab-9e8e-09bc781e559a\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.942668 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle\") pod \"1acce7dc-c882-4dab-9e8e-09bc781e559a\" (UID: \"1acce7dc-c882-4dab-9e8e-09bc781e559a\") " Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.947760 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v" (OuterVolumeSpecName: "kube-api-access-w5c2v") pod "1acce7dc-c882-4dab-9e8e-09bc781e559a" (UID: "1acce7dc-c882-4dab-9e8e-09bc781e559a"). InnerVolumeSpecName "kube-api-access-w5c2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.948277 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts" (OuterVolumeSpecName: "scripts") pod "1acce7dc-c882-4dab-9e8e-09bc781e559a" (UID: "1acce7dc-c882-4dab-9e8e-09bc781e559a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.970461 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data" (OuterVolumeSpecName: "config-data") pod "1acce7dc-c882-4dab-9e8e-09bc781e559a" (UID: "1acce7dc-c882-4dab-9e8e-09bc781e559a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:03 crc kubenswrapper[4882]: I1002 16:40:03.972144 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1acce7dc-c882-4dab-9e8e-09bc781e559a" (UID: "1acce7dc-c882-4dab-9e8e-09bc781e559a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.044914 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5c2v\" (UniqueName: \"kubernetes.io/projected/1acce7dc-c882-4dab-9e8e-09bc781e559a-kube-api-access-w5c2v\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.044955 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.044964 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.045027 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1acce7dc-c882-4dab-9e8e-09bc781e559a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.073943 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.084110 4882 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198362 4882 generic.go:334] "Generic (PLEG): container finished" podID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerID="03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b" exitCode=0 Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198411 4882 generic.go:334] "Generic (PLEG): container finished" podID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerID="3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50" exitCode=2 Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198423 4882 generic.go:334] "Generic (PLEG): container finished" podID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerID="8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244" exitCode=0 Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198487 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerDied","Data":"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b"} Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198526 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerDied","Data":"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50"} Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.198536 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerDied","Data":"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244"} Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.200766 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4gj46" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.201676 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4gj46" event={"ID":"1acce7dc-c882-4dab-9e8e-09bc781e559a","Type":"ContainerDied","Data":"e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6"} Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.201775 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e504d236c15614b4f72a50d32f2bc11701a6296b6583b457c9dbf2c9a3de81c6" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.203122 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7b05798-486d-493e-90eb-c09edb4bdc96","Type":"ContainerStarted","Data":"b1d9df38946f9d4ea04e197424306a3777d07e8fb20beef0551267a3070be30e"} Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.270993 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:40:04 crc kubenswrapper[4882]: E1002 16:40:04.271414 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1acce7dc-c882-4dab-9e8e-09bc781e559a" containerName="nova-cell1-conductor-db-sync" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.271430 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acce7dc-c882-4dab-9e8e-09bc781e559a" containerName="nova-cell1-conductor-db-sync" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.271630 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1acce7dc-c882-4dab-9e8e-09bc781e559a" containerName="nova-cell1-conductor-db-sync" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.272417 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.276170 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.283638 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.349457 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gnrb\" (UniqueName: \"kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.349748 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.349783 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.451286 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.451335 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gnrb\" (UniqueName: \"kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.451378 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.456125 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.456182 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.472195 4882 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gnrb\" (UniqueName: \"kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb\") pod \"nova-cell1-conductor-0\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.607862 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:04 crc kubenswrapper[4882]: I1002 16:40:04.779151 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d019714-550c-4da5-9d79-8bd03c1cb2f6" path="/var/lib/kubelet/pods/5d019714-550c-4da5-9d79-8bd03c1cb2f6/volumes" Oct 02 16:40:05 crc kubenswrapper[4882]: W1002 16:40:05.062236 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd2c7127_3671_4e79_aad9_01146803019e.slice/crio-8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5 WatchSource:0}: Error finding container 8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5: Status 404 returned error can't find the container with id 8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5 Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.064120 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.205778 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.217292 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd2c7127-3671-4e79-aad9-01146803019e","Type":"ContainerStarted","Data":"8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5"} Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.219023 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7b05798-486d-493e-90eb-c09edb4bdc96","Type":"ContainerStarted","Data":"f281d92386df9b537dc64cd1c262d2a95b35ba4313d97c921f09342506f64857"} Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.219335 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.228739 4882 generic.go:334] "Generic (PLEG): container finished" podID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerID="0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009" exitCode=0 Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.228807 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerDied","Data":"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009"} Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.228822 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.228839 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28e18be3-327d-42e9-a81a-39f9573b0a82","Type":"ContainerDied","Data":"e5f66b56dfc696e84ae074e4d4e7a992735234fc01fcd3ef66721fd5e7c0f9e5"} Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.228860 4882 scope.go:117] "RemoveContainer" containerID="03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.266047 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.834249633 podStartE2EDuration="2.26602562s" podCreationTimestamp="2025-10-02 16:40:03 +0000 UTC" firstStartedPulling="2025-10-02 16:40:04.08384073 +0000 UTC m=+1362.833070257" lastFinishedPulling="2025-10-02 16:40:04.515616717 +0000 UTC m=+1363.264846244" observedRunningTime="2025-10-02 16:40:05.247729025 +0000 UTC m=+1363.996958552" watchObservedRunningTime="2025-10-02 16:40:05.26602562 +0000 UTC m=+1364.015255147" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.280543 4882 scope.go:117] "RemoveContainer" containerID="3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.315127 4882 scope.go:117] "RemoveContainer" containerID="0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.343670 4882 scope.go:117] "RemoveContainer" containerID="8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.366761 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.366848 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.366892 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.366925 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxvj\" (UniqueName: \"kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.366975 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.367024 4882 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.367050 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd\") pod \"28e18be3-327d-42e9-a81a-39f9573b0a82\" (UID: \"28e18be3-327d-42e9-a81a-39f9573b0a82\") " Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.367368 4882 scope.go:117] "RemoveContainer" containerID="03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.367612 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.367971 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.368449 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b\": container with ID starting with 03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b not found: ID does not exist" containerID="03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.368507 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b"} err="failed to get container status \"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b\": rpc error: code = NotFound desc = could not find container \"03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b\": container with ID starting with 03f4918866baec4b4d63f5bcd1c6fede1ed1b70829bab45b9f2fdaae4e59099b not found: ID does not exist" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.368541 4882 scope.go:117] "RemoveContainer" containerID="3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.369029 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50\": container with ID starting with 3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50 not found: ID does not exist" containerID="3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.369070 4882 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50"} err="failed to get container status \"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50\": rpc error: code = NotFound desc = could not find container \"3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50\": container with ID starting with 3dd6934cca01264dc30bcd9e00b7168e5dd0acb7acdc6abbf4b1c871cc79ed50 not found: ID does not exist" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.369100 4882 scope.go:117] "RemoveContainer" containerID="0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.370159 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009\": container with ID starting with 0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009 not found: ID does not exist" containerID="0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.370198 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009"} err="failed to get container status \"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009\": rpc error: code = NotFound desc = could not find container \"0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009\": container with ID starting with 0d0c8e30e87c6b446775d88a1b9cf7409b91e3a1867f9f776ccbcc03d1e22009 not found: ID does not exist" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.370233 4882 scope.go:117] "RemoveContainer" containerID="8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.370555 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244\": container with ID starting with 8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244 not found: ID does not exist" containerID="8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.370593 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244"} err="failed to get container status \"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244\": rpc error: code = NotFound desc = could not find container \"8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244\": container with ID starting with 8f661e3b279393afe1421daff75408aca2702e2e8ab2299ede427c936a5e9244 not found: ID does not exist" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.371074 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts" (OuterVolumeSpecName: "scripts") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.372458 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj" (OuterVolumeSpecName: "kube-api-access-mtxvj") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "kube-api-access-mtxvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.399989 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.443286 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.462432 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data" (OuterVolumeSpecName: "config-data") pod "28e18be3-327d-42e9-a81a-39f9573b0a82" (UID: "28e18be3-327d-42e9-a81a-39f9573b0a82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.470643 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471017 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471033 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471047 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471332 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28e18be3-327d-42e9-a81a-39f9573b0a82-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471346 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxvj\" (UniqueName: \"kubernetes.io/projected/28e18be3-327d-42e9-a81a-39f9573b0a82-kube-api-access-mtxvj\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.471362 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e18be3-327d-42e9-a81a-39f9573b0a82-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.649773 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.679162 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697145 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.697584 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-notification-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697602 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-notification-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.697638 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="sg-core" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697645 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="sg-core" Oct 02 16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.697659 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="proxy-httpd" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697665 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="proxy-httpd" Oct 02 
16:40:05 crc kubenswrapper[4882]: E1002 16:40:05.697683 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-central-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697690 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-central-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697904 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-central-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697919 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="sg-core" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697931 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="proxy-httpd" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.697951 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" containerName="ceilometer-notification-agent" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.699698 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.702275 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.702461 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.702592 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.708283 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777476 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777584 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777640 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777709 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " 
pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777742 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65pqc\" (UniqueName: \"kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777904 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.777993 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.778121 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.880979 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881040 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881204 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881296 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881477 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881526 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65pqc\" (UniqueName: 
\"kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881609 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.881666 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.883076 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.890892 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.893735 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.898897 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.903493 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.904506 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65pqc\" (UniqueName: \"kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.906490 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.907073 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts\") pod \"ceilometer-0\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") " pod="openstack/ceilometer-0" Oct 02 16:40:05 crc kubenswrapper[4882]: I1002 16:40:05.990401 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.028101 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.087746 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data\") pod \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.087918 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle\") pod \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.089584 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4v7\" (UniqueName: \"kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7\") pod \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\" (UID: \"fae16249-4e2f-4325-9c4e-27062b7bb7a2\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.096835 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7" (OuterVolumeSpecName: "kube-api-access-9z4v7") pod "fae16249-4e2f-4325-9c4e-27062b7bb7a2" (UID: "fae16249-4e2f-4325-9c4e-27062b7bb7a2"). InnerVolumeSpecName "kube-api-access-9z4v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.120163 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data" (OuterVolumeSpecName: "config-data") pod "fae16249-4e2f-4325-9c4e-27062b7bb7a2" (UID: "fae16249-4e2f-4325-9c4e-27062b7bb7a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.124368 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae16249-4e2f-4325-9c4e-27062b7bb7a2" (UID: "fae16249-4e2f-4325-9c4e-27062b7bb7a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.156974 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.192784 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.192857 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z4v7\" (UniqueName: \"kubernetes.io/projected/fae16249-4e2f-4325-9c4e-27062b7bb7a2-kube-api-access-9z4v7\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.192901 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae16249-4e2f-4325-9c4e-27062b7bb7a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.251887 4882 generic.go:334] "Generic (PLEG): container finished" podID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" exitCode=0 Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.252021 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.258375 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fae16249-4e2f-4325-9c4e-27062b7bb7a2","Type":"ContainerDied","Data":"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0"} Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.258429 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fae16249-4e2f-4325-9c4e-27062b7bb7a2","Type":"ContainerDied","Data":"cd6c6efd062ee7d172dc24a82a83f1f512b901ff7c9f1a37c1b08a0a23ae7cce"} Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.258460 4882 scope.go:117] "RemoveContainer" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.265651 4882 generic.go:334] "Generic (PLEG): container finished" podID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerID="946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68" exitCode=0 Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.265739 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerDied","Data":"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68"} Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.265847 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51c53801-d401-4e8e-99f4-98dfc10c2fb0","Type":"ContainerDied","Data":"7352f5ebbd9c23c2baec3017943639327195de21d51ae82cb2a6255dd59290de"} Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.266003 4882 util.go:48] "No ready sandbox for pod can be found. 
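
The "SyncLoop (PLEG): event for pod" records above carry a structured payload, for example event={"ID":"fae16249-...","Type":"ContainerDied","Data":"0addd966..."}: ID is the pod UID, Type is the lifecycle transition, and Data is the CRI ID the event refers to. Each dying pod logs two ContainerDied events; the second Data value, never seen in a ContainerStarted event, is plausibly the pod sandbox rather than an application container. A sketch of decoding that payload, assuming the braces-delimited part is extracted verbatim (as printed, it is valid JSON):

package main

import (
	"encoding/json"
	"fmt"
)

// Shape of the event={...} payload in "SyncLoop (PLEG)" records.
type plegEvent struct {
	ID   string // pod UID
	Type string // ContainerStarted, ContainerDied, ...
	Data string // container or sandbox ID
}

func main() {
	// Payload copied from the nova-scheduler-0 ContainerDied record above.
	raw := `{"ID":"fae16249-4e2f-4325-9c4e-27062b7bb7a2","Type":"ContainerDied","Data":"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
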
Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.271843 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd2c7127-3671-4e79-aad9-01146803019e","Type":"ContainerStarted","Data":"7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e"} Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.273175 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.294703 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle\") pod \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.294875 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data\") pod \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.294921 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9xj2\" (UniqueName: \"kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2\") pod \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.294997 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs\") pod \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\" (UID: \"51c53801-d401-4e8e-99f4-98dfc10c2fb0\") " Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.298174 4882 scope.go:117] "RemoveContainer" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.298476 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs" (OuterVolumeSpecName: "logs") pod "51c53801-d401-4e8e-99f4-98dfc10c2fb0" (UID: "51c53801-d401-4e8e-99f4-98dfc10c2fb0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.302711 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0\": container with ID starting with 0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0 not found: ID does not exist" containerID="0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.302799 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0"} err="failed to get container status \"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0\": rpc error: code = NotFound desc = could not find container \"0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0\": container with ID starting with 0addd966ce4407b224b8a7540f5f864505edf8f862bab81febac78ce1a5a5fb0 not found: ID does not exist" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.302867 4882 scope.go:117] "RemoveContainer" containerID="946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.302087 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.302066919 podStartE2EDuration="2.302066919s" podCreationTimestamp="2025-10-02 16:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:06.293766962 +0000 UTC m=+1365.042996489" watchObservedRunningTime="2025-10-02 16:40:06.302066919 +0000 UTC m=+1365.051296446" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.319103 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2" (OuterVolumeSpecName: "kube-api-access-l9xj2") pod "51c53801-d401-4e8e-99f4-98dfc10c2fb0" (UID: "51c53801-d401-4e8e-99f4-98dfc10c2fb0"). InnerVolumeSpecName "kube-api-access-l9xj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.343510 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.348456 4882 scope.go:117] "RemoveContainer" containerID="b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.360310 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51c53801-d401-4e8e-99f4-98dfc10c2fb0" (UID: "51c53801-d401-4e8e-99f4-98dfc10c2fb0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.371235 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.371360 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data" (OuterVolumeSpecName: "config-data") pod "51c53801-d401-4e8e-99f4-98dfc10c2fb0" (UID: "51c53801-d401-4e8e-99f4-98dfc10c2fb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.379928 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.380470 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerName="nova-scheduler-scheduler" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380491 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerName="nova-scheduler-scheduler" Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.380507 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-log" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380513 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-log" Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.380538 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-api" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380544 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-api" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380723 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" containerName="nova-scheduler-scheduler" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380747 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-log" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.380757 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" containerName="nova-api-api" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.381468 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.382377 4882 scope.go:117] "RemoveContainer" containerID="946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68" Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.383970 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68\": container with ID starting with 946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68 not found: ID does not exist" containerID="946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.384012 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68"} err="failed to get container status \"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68\": rpc error: code = NotFound desc = could not find container \"946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68\": container with ID starting with 946203f6d6de5880250bc6bac07d602f6203940d8dbc66ca8754153d13673f68 not found: ID does not exist" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.384049 4882 scope.go:117] "RemoveContainer" containerID="b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.384182 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 16:40:06 crc kubenswrapper[4882]: E1002 16:40:06.385485 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5\": container with ID starting with b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5 not found: ID does not exist" containerID="b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.385527 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5"} err="failed to get container status \"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5\": rpc error: code = NotFound desc = could not find container \"b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5\": container with ID starting with b628a90fd6981a9218393014c15c9b80d2f1ab9c83e5397386879b9684dc4fc5 not found: ID does not exist" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.398300 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.399888 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9xj2\" (UniqueName: \"kubernetes.io/projected/51c53801-d401-4e8e-99f4-98dfc10c2fb0-kube-api-access-l9xj2\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.399908 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51c53801-d401-4e8e-99f4-98dfc10c2fb0-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.399919 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.399929 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51c53801-d401-4e8e-99f4-98dfc10c2fb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.508642 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dpp\" (UniqueName: \"kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.508782 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.509040 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.574075 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.614960 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.615101 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dpp\" (UniqueName: \"kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.615225 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.621964 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.623129 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 
16:40:06.634263 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dpp\" (UniqueName: \"kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp\") pod \"nova-scheduler-0\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.704787 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.737351 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.751026 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.788858 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e18be3-327d-42e9-a81a-39f9573b0a82" path="/var/lib/kubelet/pods/28e18be3-327d-42e9-a81a-39f9573b0a82/volumes" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.789773 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c53801-d401-4e8e-99f4-98dfc10c2fb0" path="/var/lib/kubelet/pods/51c53801-d401-4e8e-99f4-98dfc10c2fb0/volumes" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.790342 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae16249-4e2f-4325-9c4e-27062b7bb7a2" path="/var/lib/kubelet/pods/fae16249-4e2f-4325-9c4e-27062b7bb7a2/volumes" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.794870 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.796583 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.796686 4882 util.go:30] "No sandbox for pod can be found. 
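
"Cleaned up orphaned pod volumes dir" above is housekeeping for the three terminated pods: once every volume under /var/lib/kubelet/pods/<uid>/volumes has been unmounted and detached, the leftover directory can go. A simplified sketch of the safety check involved (real cleanup must also cope with still-mounted subpaths, so treat this as illustrative only):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedPodDir removes <root>/pods/<uid>/volumes only if it is
// empty, i.e. every volume was already torn down.
func cleanupOrphanedPodDir(root, uid string) error {
	dir := filepath.Join(root, "pods", uid, "volumes")
	entries, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	if len(entries) > 0 {
		return fmt.Errorf("%s still holds %d volume(s), skipping", dir, len(entries))
	}
	return os.Remove(dir)
}

func main() {
	root, _ := os.MkdirTemp("", "kubelet")
	uid := "fae16249-4e2f-4325-9c4e-27062b7bb7a2"
	_ = os.MkdirAll(filepath.Join(root, "pods", uid, "volumes"), 0o755)
	fmt.Println(cleanupOrphanedPodDir(root, uid)) // <nil>
}
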
Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.799321 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.837377 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.837742 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.837935 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.838044 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hfz\" (UniqueName: \"kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.939525 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.939642 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hfz\" (UniqueName: \"kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.939724 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.939753 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.940421 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.945867 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.946403 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:06 crc kubenswrapper[4882]: I1002 16:40:06.956478 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hfz\" (UniqueName: \"kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz\") pod \"nova-api-0\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") " pod="openstack/nova-api-0" Oct 02 16:40:07 crc kubenswrapper[4882]: I1002 16:40:07.149824 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:07 crc kubenswrapper[4882]: I1002 16:40:07.201014 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:07 crc kubenswrapper[4882]: W1002 16:40:07.202647 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87a4d803_638a_4953_9af6_0b5214faece7.slice/crio-42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a WatchSource:0}: Error finding container 42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a: Status 404 returned error can't find the container with id 42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a Oct 02 16:40:07 crc kubenswrapper[4882]: I1002 16:40:07.297075 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87a4d803-638a-4953-9af6-0b5214faece7","Type":"ContainerStarted","Data":"42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a"} Oct 02 16:40:07 crc kubenswrapper[4882]: I1002 16:40:07.307277 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerStarted","Data":"16ae0f5e0e5cb284b86fba306b33afbfdde8ef0c26850966325cdf87d5183db3"} Oct 02 16:40:07 crc kubenswrapper[4882]: I1002 16:40:07.685256 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.344556 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerStarted","Data":"f18bc44f398b818cae734f1f83be077ed55ce14117cea74ba6a97d7ea9d8405f"} Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.348045 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerStarted","Data":"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"} Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.348082 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerStarted","Data":"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"} Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.348092 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerStarted","Data":"212dd65bc0eb033ff74870c7d16e7244b93df691c5c08807b3451f582d31c505"} Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.355153 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87a4d803-638a-4953-9af6-0b5214faece7","Type":"ContainerStarted","Data":"e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874"} Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.368430 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.368415164 podStartE2EDuration="2.368415164s" podCreationTimestamp="2025-10-02 16:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:08.365561522 +0000 UTC m=+1367.114791049" watchObservedRunningTime="2025-10-02 16:40:08.368415164 +0000 UTC m=+1367.117644691" Oct 02 16:40:08 crc kubenswrapper[4882]: I1002 16:40:08.385813 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.385793117 podStartE2EDuration="2.385793117s" podCreationTimestamp="2025-10-02 16:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:08.379074449 +0000 UTC m=+1367.128303986" watchObservedRunningTime="2025-10-02 16:40:08.385793117 +0000 UTC m=+1367.135022644" Oct 02 16:40:09 crc kubenswrapper[4882]: I1002 16:40:09.367295 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerStarted","Data":"971e0cc7e7a10d06bc20bd61c9ddf0b2f3c74e76f780d7fb2f730d17c79fc78a"} Oct 02 16:40:09 crc kubenswrapper[4882]: I1002 16:40:09.367774 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerStarted","Data":"f223b199ee37c2e537119dfed1e5fe2848304b5f0b28b777629e0bdfdc6572e4"} Oct 02 16:40:09 crc kubenswrapper[4882]: I1002 16:40:09.390050 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:40:09 crc kubenswrapper[4882]: I1002 16:40:09.390111 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:40:11 crc kubenswrapper[4882]: I1002 16:40:11.705430 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 16:40:12 crc kubenswrapper[4882]: I1002 16:40:12.408842 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerStarted","Data":"3d705bc01872a8328111212c60559567e006b902406cd7f05a88dd039de7374a"} Oct 02 16:40:12 crc kubenswrapper[4882]: I1002 16:40:12.409564 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Oct 02 16:40:12 crc kubenswrapper[4882]: I1002 16:40:12.439369 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.887570595 podStartE2EDuration="7.439350495s" podCreationTimestamp="2025-10-02 16:40:05 +0000 UTC" firstStartedPulling="2025-10-02 16:40:06.590902934 +0000 UTC m=+1365.340132481" lastFinishedPulling="2025-10-02 16:40:11.142682854 +0000 UTC m=+1369.891912381" observedRunningTime="2025-10-02 16:40:12.438812792 +0000 UTC m=+1371.188042329" watchObservedRunningTime="2025-10-02 16:40:12.439350495 +0000 UTC m=+1371.188580022" Oct 02 16:40:13 crc kubenswrapper[4882]: I1002 16:40:13.587654 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 16:40:14 crc kubenswrapper[4882]: I1002 16:40:14.636802 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 16:40:16 crc kubenswrapper[4882]: I1002 16:40:16.705001 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 16:40:16 crc kubenswrapper[4882]: I1002 16:40:16.731269 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 16:40:17 crc kubenswrapper[4882]: I1002 16:40:17.150904 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:40:17 crc kubenswrapper[4882]: I1002 16:40:17.150950 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:40:17 crc kubenswrapper[4882]: I1002 16:40:17.491733 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 16:40:18 crc kubenswrapper[4882]: I1002 16:40:18.234409 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 16:40:18 crc kubenswrapper[4882]: I1002 16:40:18.234411 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.467621 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.475265 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511552 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data\") pod \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511618 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs\") pod \"87efe738-d03e-496d-bdb4-fada78382621\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511646 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle\") pod \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511685 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfjmk\" (UniqueName: \"kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk\") pod \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\" (UID: \"ab19f927-457c-4ddf-a46d-fff6f76d24a0\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511747 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2knk\" (UniqueName: \"kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk\") pod \"87efe738-d03e-496d-bdb4-fada78382621\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511789 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data\") pod \"87efe738-d03e-496d-bdb4-fada78382621\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.511851 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle\") pod \"87efe738-d03e-496d-bdb4-fada78382621\" (UID: \"87efe738-d03e-496d-bdb4-fada78382621\") " Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.521752 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs" (OuterVolumeSpecName: "logs") pod "87efe738-d03e-496d-bdb4-fada78382621" (UID: "87efe738-d03e-496d-bdb4-fada78382621"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.526434 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk" (OuterVolumeSpecName: "kube-api-access-xfjmk") pod "ab19f927-457c-4ddf-a46d-fff6f76d24a0" (UID: "ab19f927-457c-4ddf-a46d-fff6f76d24a0"). InnerVolumeSpecName "kube-api-access-xfjmk". 
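
The 16:40:18 "Probe failed" records a few lines up show the nova-api-0 startup probe timing out: a GET against http://10.217.0.195:8774/ fails with "context deadline exceeded (Client.Timeout exceeded while awaiting headers)", meaning the API had not produced response headers within the probe timeout while it was still initializing. That exact error text is what Go's net/http client emits when its Timeout elapses; a sketch reproducing it with a stand-in server that answers too slowly (the real endpoint and timeout values live in the probe spec, not in this log):

package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// Stand-in for an API that is still starting up.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
	}))
	defer slow.Close()

	// An HTTP probe with a 1 s budget, like a kubelet httpGet probe.
	client := &http.Client{Timeout: 1 * time.Second}
	_, err := client.Get(slow.URL)
	fmt.Println(err)
	// Get "http://127.0.0.1:...": context deadline exceeded
	// (Client.Timeout exceeded while awaiting headers)
}
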
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.526964 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk" (OuterVolumeSpecName: "kube-api-access-b2knk") pod "87efe738-d03e-496d-bdb4-fada78382621" (UID: "87efe738-d03e-496d-bdb4-fada78382621"). InnerVolumeSpecName "kube-api-access-b2knk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.540001 4882 generic.go:334] "Generic (PLEG): container finished" podID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" containerID="eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28" exitCode=137 Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.540039 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.540091 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab19f927-457c-4ddf-a46d-fff6f76d24a0","Type":"ContainerDied","Data":"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28"} Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.540127 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab19f927-457c-4ddf-a46d-fff6f76d24a0","Type":"ContainerDied","Data":"563380e48337606dc620ca37ba963d613d3426e733cede682e01401ecf01aa54"} Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.540146 4882 scope.go:117] "RemoveContainer" containerID="eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.543067 4882 generic.go:334] "Generic (PLEG): container finished" podID="87efe738-d03e-496d-bdb4-fada78382621" containerID="67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e" exitCode=137 Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.543099 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerDied","Data":"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e"} Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.543111 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.543121 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87efe738-d03e-496d-bdb4-fada78382621","Type":"ContainerDied","Data":"ac50e1efd595554d1b69767643f1fd473dd40b95529fe15ffb2a801cdf4258e0"} Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.551926 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87efe738-d03e-496d-bdb4-fada78382621" (UID: "87efe738-d03e-496d-bdb4-fada78382621"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.554064 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab19f927-457c-4ddf-a46d-fff6f76d24a0" (UID: "ab19f927-457c-4ddf-a46d-fff6f76d24a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.559344 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data" (OuterVolumeSpecName: "config-data") pod "ab19f927-457c-4ddf-a46d-fff6f76d24a0" (UID: "ab19f927-457c-4ddf-a46d-fff6f76d24a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.571556 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data" (OuterVolumeSpecName: "config-data") pod "87efe738-d03e-496d-bdb4-fada78382621" (UID: "87efe738-d03e-496d-bdb4-fada78382621"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.587130 4882 scope.go:117] "RemoveContainer" containerID="eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28" Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.587682 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28\": container with ID starting with eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28 not found: ID does not exist" containerID="eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.587734 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28"} err="failed to get container status \"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28\": rpc error: code = NotFound desc = could not find container \"eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28\": container with ID starting with eca47d59524a555a954992bcb4857f6d20854c542405da1c301e67f995579d28 not found: ID does not exist" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.587766 4882 scope.go:117] "RemoveContainer" containerID="67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.606386 4882 scope.go:117] "RemoveContainer" containerID="de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.616977 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617025 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617041 4882 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87efe738-d03e-496d-bdb4-fada78382621-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617053 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab19f927-457c-4ddf-a46d-fff6f76d24a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617066 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfjmk\" (UniqueName: \"kubernetes.io/projected/ab19f927-457c-4ddf-a46d-fff6f76d24a0-kube-api-access-xfjmk\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617082 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2knk\" (UniqueName: \"kubernetes.io/projected/87efe738-d03e-496d-bdb4-fada78382621-kube-api-access-b2knk\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.617096 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87efe738-d03e-496d-bdb4-fada78382621-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.625833 4882 scope.go:117] "RemoveContainer" containerID="67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e" Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.626231 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e\": container with ID starting with 67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e not found: ID does not exist" containerID="67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.626282 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e"} err="failed to get container status \"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e\": rpc error: code = NotFound desc = could not find container \"67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e\": container with ID starting with 67731b907ad53fedd0b59002cab00e15e61902723abdda22125a0964ce31ae3e not found: ID does not exist" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.626323 4882 scope.go:117] "RemoveContainer" containerID="de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0" Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.626708 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0\": container with ID starting with de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0 not found: ID does not exist" containerID="de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.626746 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0"} err="failed to get container status \"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0\": rpc error: code = NotFound desc = could not find container 
\"de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0\": container with ID starting with de738b36c978eedac96d4310393c9aea481ecce362bdbf8689a44bd974d767f0 not found: ID does not exist" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.871377 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.891873 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.900239 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.908038 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.916774 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.917298 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-log" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917322 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-log" Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.917360 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-metadata" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917376 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-metadata" Oct 02 16:40:24 crc kubenswrapper[4882]: E1002 16:40:24.917399 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917405 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917602 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-log" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917623 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.917647 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="87efe738-d03e-496d-bdb4-fada78382621" containerName="nova-metadata-metadata" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.918371 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.922589 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.922861 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.923050 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.925167 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.927033 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.930338 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.930896 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.935600 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:40:24 crc kubenswrapper[4882]: I1002 16:40:24.946389 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.022948 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023227 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023311 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023485 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2lv\" (UniqueName: \"kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023554 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc 
kubenswrapper[4882]: I1002 16:40:25.023615 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24pph\" (UniqueName: \"kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023655 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023689 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023762 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.023818 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125666 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125722 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125823 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2lv\" (UniqueName: \"kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125864 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125892 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-24pph\" (UniqueName: \"kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125922 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125953 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.125982 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.126006 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.126042 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.126669 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.130739 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.130991 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.131657 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.132389 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.132798 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.134694 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.135341 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.149043 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24pph\" (UniqueName: \"kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph\") pod \"nova-cell1-novncproxy-0\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.150034 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2lv\" (UniqueName: \"kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv\") pod \"nova-metadata-0\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") " pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.236750 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.247534 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.560891 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:25 crc kubenswrapper[4882]: I1002 16:40:25.813839 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:40:25 crc kubenswrapper[4882]: W1002 16:40:25.822174 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5a965f_ea7b_44a8_9fc4_6cd88f67bdfc.slice/crio-9f626802537d9b5bb34017058c5ae701d3df300f8cd131dcaf378d19d943f743 WatchSource:0}: Error finding container 9f626802537d9b5bb34017058c5ae701d3df300f8cd131dcaf378d19d943f743: Status 404 returned error can't find the container with id 9f626802537d9b5bb34017058c5ae701d3df300f8cd131dcaf378d19d943f743 Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.598578 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerStarted","Data":"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"} Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.598663 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerStarted","Data":"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"} Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.598731 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerStarted","Data":"7afaa2372dbacb9f4d11727ef4f04c553643cc300cd810587ca156e1c3c20db2"} Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.605199 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc","Type":"ContainerStarted","Data":"59d76121f684bf647e82fb7939a01b0eef06443bf9081dd3691a4619a445253e"} Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.605282 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc","Type":"ContainerStarted","Data":"9f626802537d9b5bb34017058c5ae701d3df300f8cd131dcaf378d19d943f743"} Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.625527 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.625506476 podStartE2EDuration="2.625506476s" podCreationTimestamp="2025-10-02 16:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:26.6168211 +0000 UTC m=+1385.366050627" watchObservedRunningTime="2025-10-02 16:40:26.625506476 +0000 UTC m=+1385.374736013" Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.650116 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.650094029 podStartE2EDuration="2.650094029s" podCreationTimestamp="2025-10-02 16:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:26.647842893 +0000 UTC m=+1385.397072430" watchObservedRunningTime="2025-10-02 16:40:26.650094029 +0000 UTC m=+1385.399323556" 
Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.770621 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87efe738-d03e-496d-bdb4-fada78382621" path="/var/lib/kubelet/pods/87efe738-d03e-496d-bdb4-fada78382621/volumes"
Oct 02 16:40:26 crc kubenswrapper[4882]: I1002 16:40:26.771329 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab19f927-457c-4ddf-a46d-fff6f76d24a0" path="/var/lib/kubelet/pods/ab19f927-457c-4ddf-a46d-fff6f76d24a0/volumes"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.155307 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.155796 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.156057 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.156104 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.158292 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.159452 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.366627 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"]
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.368522 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.386230 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"]
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.469814 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.470158 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.470257 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkft\" (UniqueName: \"kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.470290 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.470323 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.470353 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572272 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572390 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572493 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtkft\" (UniqueName: \"kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572534 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572583 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.572621 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.573843 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.573843 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.573931 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.573959 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.574006 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.604391 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkft\" (UniqueName: \"kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft\") pod \"dnsmasq-dns-7d749c565-wv426\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:27 crc kubenswrapper[4882]: I1002 16:40:27.710980 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:28 crc kubenswrapper[4882]: I1002 16:40:28.210519 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"]
Oct 02 16:40:28 crc kubenswrapper[4882]: W1002 16:40:28.218062 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f4b488_20e6_4007_b1b4_891b06b16276.slice/crio-d085527c20f1844eb1887caf8efbb33859e7d86fba7c8d43ee22201e205a8e86 WatchSource:0}: Error finding container d085527c20f1844eb1887caf8efbb33859e7d86fba7c8d43ee22201e205a8e86: Status 404 returned error can't find the container with id d085527c20f1844eb1887caf8efbb33859e7d86fba7c8d43ee22201e205a8e86
Oct 02 16:40:28 crc kubenswrapper[4882]: I1002 16:40:28.635202 4882 generic.go:334] "Generic (PLEG): container finished" podID="65f4b488-20e6-4007-b1b4-891b06b16276" containerID="51e94341fed25c5ed8e828868b22a1e16ae34b4b130c9a7bed213fad084185d0" exitCode=0
Oct 02 16:40:28 crc kubenswrapper[4882]: I1002 16:40:28.635303 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d749c565-wv426" event={"ID":"65f4b488-20e6-4007-b1b4-891b06b16276","Type":"ContainerDied","Data":"51e94341fed25c5ed8e828868b22a1e16ae34b4b130c9a7bed213fad084185d0"}
Oct 02 16:40:28 crc kubenswrapper[4882]: I1002 16:40:28.635638 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d749c565-wv426" event={"ID":"65f4b488-20e6-4007-b1b4-891b06b16276","Type":"ContainerStarted","Data":"d085527c20f1844eb1887caf8efbb33859e7d86fba7c8d43ee22201e205a8e86"}
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.239489 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.240042 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-central-agent" containerID="cri-o://f18bc44f398b818cae734f1f83be077ed55ce14117cea74ba6a97d7ea9d8405f" gracePeriod=30
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.240411 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="proxy-httpd" containerID="cri-o://3d705bc01872a8328111212c60559567e006b902406cd7f05a88dd039de7374a" gracePeriod=30
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.240526 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="sg-core" containerID="cri-o://f223b199ee37c2e537119dfed1e5fe2848304b5f0b28b777629e0bdfdc6572e4" gracePeriod=30
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.240578 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-notification-agent" containerID="cri-o://971e0cc7e7a10d06bc20bd61c9ddf0b2f3c74e76f780d7fb2f730d17c79fc78a" gracePeriod=30
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.251816 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.193:3000/\": read tcp 10.217.0.2:44942->10.217.0.193:3000: read: connection reset by peer"
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.648815 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d749c565-wv426" event={"ID":"65f4b488-20e6-4007-b1b4-891b06b16276","Type":"ContainerStarted","Data":"bcbf61e67fe3d94f3a2827ff0936322bdb1ccefdfe19185199aaf29bd9db4ce7"}
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.648979 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.654460 4882 generic.go:334] "Generic (PLEG): container finished" podID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerID="3d705bc01872a8328111212c60559567e006b902406cd7f05a88dd039de7374a" exitCode=0
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.654492 4882 generic.go:334] "Generic (PLEG): container finished" podID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerID="f223b199ee37c2e537119dfed1e5fe2848304b5f0b28b777629e0bdfdc6572e4" exitCode=2
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.654513 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerDied","Data":"3d705bc01872a8328111212c60559567e006b902406cd7f05a88dd039de7374a"}
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.654537 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerDied","Data":"f223b199ee37c2e537119dfed1e5fe2848304b5f0b28b777629e0bdfdc6572e4"}
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.675089 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d749c565-wv426" podStartSLOduration=2.675065814 podStartE2EDuration="2.675065814s" podCreationTimestamp="2025-10-02 16:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:29.667235368 +0000 UTC m=+1388.416464895" watchObservedRunningTime="2025-10-02 16:40:29.675065814 +0000 UTC m=+1388.424295341"
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.865027 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.865395 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-log" containerID="cri-o://cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831" gracePeriod=30
Oct 02 16:40:29 crc kubenswrapper[4882]: I1002 16:40:29.865444 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-api" containerID="cri-o://9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f" gracePeriod=30
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.237733 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.248580 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.248654 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.665005 4882 generic.go:334] "Generic (PLEG): container finished" podID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerID="cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831" exitCode=143
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.665060 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerDied","Data":"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"}
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.668927 4882 generic.go:334] "Generic (PLEG): container finished" podID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerID="f18bc44f398b818cae734f1f83be077ed55ce14117cea74ba6a97d7ea9d8405f" exitCode=0
Oct 02 16:40:30 crc kubenswrapper[4882]: I1002 16:40:30.668993 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerDied","Data":"f18bc44f398b818cae734f1f83be077ed55ce14117cea74ba6a97d7ea9d8405f"}
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.431929 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.594011 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs\") pod \"da386d69-5db4-40f9-85ef-502141b7b6f2\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") "
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.594264 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7hfz\" (UniqueName: \"kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz\") pod \"da386d69-5db4-40f9-85ef-502141b7b6f2\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") "
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.594398 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle\") pod \"da386d69-5db4-40f9-85ef-502141b7b6f2\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") "
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.594507 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data\") pod \"da386d69-5db4-40f9-85ef-502141b7b6f2\" (UID: \"da386d69-5db4-40f9-85ef-502141b7b6f2\") "
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.594567 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs" (OuterVolumeSpecName: "logs") pod "da386d69-5db4-40f9-85ef-502141b7b6f2" (UID: "da386d69-5db4-40f9-85ef-502141b7b6f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.595915 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da386d69-5db4-40f9-85ef-502141b7b6f2-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.600119 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz" (OuterVolumeSpecName: "kube-api-access-x7hfz") pod "da386d69-5db4-40f9-85ef-502141b7b6f2" (UID: "da386d69-5db4-40f9-85ef-502141b7b6f2"). InnerVolumeSpecName "kube-api-access-x7hfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.641836 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data" (OuterVolumeSpecName: "config-data") pod "da386d69-5db4-40f9-85ef-502141b7b6f2" (UID: "da386d69-5db4-40f9-85ef-502141b7b6f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.651382 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da386d69-5db4-40f9-85ef-502141b7b6f2" (UID: "da386d69-5db4-40f9-85ef-502141b7b6f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697170 4882 generic.go:334] "Generic (PLEG): container finished" podID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerID="9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f" exitCode=0
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697226 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerDied","Data":"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"}
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697255 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"da386d69-5db4-40f9-85ef-502141b7b6f2","Type":"ContainerDied","Data":"212dd65bc0eb033ff74870c7d16e7244b93df691c5c08807b3451f582d31c505"}
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697271 4882 scope.go:117] "RemoveContainer" containerID="9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697388 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.697824 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.698511 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da386d69-5db4-40f9-85ef-502141b7b6f2-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.698542 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7hfz\" (UniqueName: \"kubernetes.io/projected/da386d69-5db4-40f9-85ef-502141b7b6f2-kube-api-access-x7hfz\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.738791 4882 scope.go:117] "RemoveContainer" containerID="cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.742371 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.771320 4882 scope.go:117] "RemoveContainer" containerID="9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"
Oct 02 16:40:33 crc kubenswrapper[4882]: E1002 16:40:33.774531 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f\": container with ID starting with 9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f not found: ID does not exist" containerID="9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.774599 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f"} err="failed to get container status \"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f\": rpc error: code = NotFound desc = could not find container \"9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f\": container with ID starting with 9feca2fb7a818f03fad7be7bb567d37dc1da16f599ad38b283f201009972f13f not found: ID does not exist"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.774633 4882 scope.go:117] "RemoveContainer" containerID="cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"
Oct 02 16:40:33 crc kubenswrapper[4882]: E1002 16:40:33.775196 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831\": container with ID starting with cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831 not found: ID does not exist" containerID="cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.775232 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831"} err="failed to get container status \"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831\": rpc error: code = NotFound desc = could not find container \"cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831\": container with ID starting with cf5d243b48df41dbe0b2c2c84ec13467c3871373acbc3ff18e974998e2225831 not found: ID does not exist"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.784995 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.794318 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:33 crc kubenswrapper[4882]: E1002 16:40:33.795623 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-log"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.795650 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-log"
Oct 02 16:40:33 crc kubenswrapper[4882]: E1002 16:40:33.795686 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-api"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.795692 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-api"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.796040 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-api"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.796080 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" containerName="nova-api-log"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.797932 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.806160 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.806470 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.806622 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.815957 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908235 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908297 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908335 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908437 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908467 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4qb\" (UniqueName: \"kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:33 crc kubenswrapper[4882]: I1002 16:40:33.908522 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010470 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010562 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010612 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010660 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010761 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.010804 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms4qb\" (UniqueName: \"kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.011334 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.014882 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.015572 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.015664 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.017980 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.031344 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4qb\" (UniqueName: \"kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb\") pod \"nova-api-0\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.126826 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.605679 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.709543 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerStarted","Data":"a7c43102768ddaa5242c596799d155432d73a7e1f591d5dfc345f3dbbf54230c"}
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.712748 4882 generic.go:334] "Generic (PLEG): container finished" podID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerID="971e0cc7e7a10d06bc20bd61c9ddf0b2f3c74e76f780d7fb2f730d17c79fc78a" exitCode=0
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.712940 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerDied","Data":"971e0cc7e7a10d06bc20bd61c9ddf0b2f3c74e76f780d7fb2f730d17c79fc78a"}
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.778967 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da386d69-5db4-40f9-85ef-502141b7b6f2" path="/var/lib/kubelet/pods/da386d69-5db4-40f9-85ef-502141b7b6f2/volumes"
Oct 02 16:40:34 crc kubenswrapper[4882]: I1002 16:40:34.884884 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027568 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027621 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027677 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027761 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027809 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65pqc\" (UniqueName: \"kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027887 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.027922 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.028007 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml\") pod \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\" (UID: \"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c\") "
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.028163 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.028670 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.028693 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.032503 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc" (OuterVolumeSpecName: "kube-api-access-65pqc") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "kube-api-access-65pqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.034299 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts" (OuterVolumeSpecName: "scripts") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.057260 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.080685 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.104588 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130153 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130467 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130479 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130487 4882 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130498 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.130510 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65pqc\" (UniqueName: \"kubernetes.io/projected/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-kube-api-access-65pqc\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.142909 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data" (OuterVolumeSpecName: "config-data") pod "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" (UID: "0d3c4c7e-1605-466e-a2a7-7f70565f9f9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.231907 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.237483 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.248713 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.248942 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.263487 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.727431 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d3c4c7e-1605-466e-a2a7-7f70565f9f9c","Type":"ContainerDied","Data":"16ae0f5e0e5cb284b86fba306b33afbfdde8ef0c26850966325cdf87d5183db3"}
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.728846 4882 scope.go:117] "RemoveContainer" containerID="3d705bc01872a8328111212c60559567e006b902406cd7f05a88dd039de7374a"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.727713 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.731772 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerStarted","Data":"2b150cf2eb2efb3a53ced75685b33b145de98cd7a2558f2d67e2d3decf8c710e"}
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.731811 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerStarted","Data":"2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763"}
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.754943 4882 scope.go:117] "RemoveContainer" containerID="f223b199ee37c2e537119dfed1e5fe2848304b5f0b28b777629e0bdfdc6572e4"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.759383 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.760939 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.760919429 podStartE2EDuration="2.760919429s" podCreationTimestamp="2025-10-02 16:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:35.757426311 +0000 UTC m=+1394.506655828" watchObservedRunningTime="2025-10-02 16:40:35.760919429 +0000 UTC m=+1394.510148956"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.787288 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.788950 4882 scope.go:117] "RemoveContainer" containerID="971e0cc7e7a10d06bc20bd61c9ddf0b2f3c74e76f780d7fb2f730d17c79fc78a"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.799321 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.817745 4882 scope.go:117] "RemoveContainer" containerID="f18bc44f398b818cae734f1f83be077ed55ce14117cea74ba6a97d7ea9d8405f"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.829680 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:40:35 crc kubenswrapper[4882]: E1002 16:40:35.830131 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="proxy-httpd"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830153 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="proxy-httpd"
Oct 02 16:40:35 crc kubenswrapper[4882]: E1002 16:40:35.830167 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-notification-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830173 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-notification-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: E1002 16:40:35.830189 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-central-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830195 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-central-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: E1002 16:40:35.830226 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="sg-core"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830233 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="sg-core"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830407 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="sg-core"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830422 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="proxy-httpd"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830439 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-central-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.830460 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" containerName="ceilometer-notification-agent"
Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.832414 4882 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.847367 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.848059 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.848359 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.890909 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.950577 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssk9g\" (UniqueName: \"kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.950951 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951007 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951037 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951077 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951110 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951128 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.951187 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.989317 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gdh2h"] Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.992006 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.995402 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.995755 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 16:40:35 crc kubenswrapper[4882]: I1002 16:40:35.999147 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gdh2h"] Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053424 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053508 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssk9g\" (UniqueName: \"kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053554 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225j2\" (UniqueName: \"kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053631 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053711 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053763 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053810 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data\") pod 
\"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053855 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053897 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053929 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053955 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.053980 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.055049 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.055259 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.057635 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.059703 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.063723 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.071766 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.074998 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.076641 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssk9g\" (UniqueName: \"kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g\") pod \"ceilometer-0\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.155702 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.155780 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.155807 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.155862 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225j2\" (UniqueName: \"kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.159951 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.160504 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " 
pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.161727 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.173874 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225j2\" (UniqueName: \"kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2\") pod \"nova-cell1-cell-mapping-gdh2h\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.184060 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.270106 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 16:40:36 crc kubenswrapper[4882]: I1002 16:40:36.270147 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:36.319498 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:36.774407 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3c4c7e-1605-466e-a2a7-7f70565f9f9c" path="/var/lib/kubelet/pods/0d3c4c7e-1605-466e-a2a7-7f70565f9f9c/volumes" Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.429485 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.521125 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gdh2h"] Oct 02 16:40:37 crc kubenswrapper[4882]: W1002 16:40:37.526654 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34084c20_a5bd_437c_8e9b_f972e89bdc34.slice/crio-d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f WatchSource:0}: Error finding container d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f: Status 404 returned error can't find the container with id d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.713493 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d749c565-wv426" Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.755554 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerStarted","Data":"ebdbe5ad9cdb46e7cd1007f3d75872a47746b63bf654406a4ff0504331a5d4ad"} Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.757130 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gdh2h" event={"ID":"34084c20-a5bd-437c-8e9b-f972e89bdc34","Type":"ContainerStarted","Data":"d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f"} Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.787525 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"] Oct 02 16:40:37 crc kubenswrapper[4882]: I1002 16:40:37.787830 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="dnsmasq-dns" containerID="cri-o://64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473" gracePeriod=10 Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.342264 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.409634 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km6ln\" (UniqueName: \"kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.409830 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.409859 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.409925 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.410028 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.410990 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb\") pod \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\" (UID: \"ac797c1e-927e-42d4-81f5-6cb74bd13e8f\") " Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.442484 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln" (OuterVolumeSpecName: "kube-api-access-km6ln") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "kube-api-access-km6ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.489993 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config" (OuterVolumeSpecName: "config") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.490719 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.499995 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.513977 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.514018 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.514444 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.514466 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km6ln\" (UniqueName: \"kubernetes.io/projected/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-kube-api-access-km6ln\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.519447 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.546820 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac797c1e-927e-42d4-81f5-6cb74bd13e8f" (UID: "ac797c1e-927e-42d4-81f5-6cb74bd13e8f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.616254 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.616296 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac797c1e-927e-42d4-81f5-6cb74bd13e8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.775826 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerStarted","Data":"1f23372a0ffce94576fa7b8b12f78b5723d92783fb82f3c3a11c09be97afc71e"} Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.775890 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerStarted","Data":"9a942b8c17e8463a8e6f76d1513de147a7fbccbc3c6bce75a888d236140df363"} Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.780041 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gdh2h" event={"ID":"34084c20-a5bd-437c-8e9b-f972e89bdc34","Type":"ContainerStarted","Data":"bcb72f41583b2dc39fc26211b1de727dd1fbe88709748d3f9c2182967e02454f"} Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.785713 4882 generic.go:334] "Generic (PLEG): container finished" podID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerID="64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473" exitCode=0 Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.785765 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.785769 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" event={"ID":"ac797c1e-927e-42d4-81f5-6cb74bd13e8f","Type":"ContainerDied","Data":"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473"} Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.785862 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb949cd99-xmhnq" event={"ID":"ac797c1e-927e-42d4-81f5-6cb74bd13e8f","Type":"ContainerDied","Data":"31ae64c7f7c079bb905a1b98236bfaaf54dd5704ee9c21ffff896d3e2675c5c0"} Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.785953 4882 scope.go:117] "RemoveContainer" containerID="64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.800839 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gdh2h" podStartSLOduration=3.800815646 podStartE2EDuration="3.800815646s" podCreationTimestamp="2025-10-02 16:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:38.797069092 +0000 UTC m=+1397.546298629" watchObservedRunningTime="2025-10-02 16:40:38.800815646 +0000 UTC m=+1397.550045173" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.812297 4882 scope.go:117] "RemoveContainer" containerID="1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.840611 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"] Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.853740 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb949cd99-xmhnq"] Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.862374 4882 scope.go:117] "RemoveContainer" containerID="64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473" Oct 02 16:40:38 crc kubenswrapper[4882]: E1002 16:40:38.863048 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473\": container with ID starting with 64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473 not found: ID does not exist" containerID="64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.863099 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473"} err="failed to get container status \"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473\": rpc error: code = NotFound desc = could not find container \"64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473\": container with ID starting with 64bf99b6c105d54d24a9686888da5cd62bab1473822a0fbe0c969d559bf9b473 not found: ID does not exist" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.863129 4882 scope.go:117] "RemoveContainer" containerID="1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33" Oct 02 16:40:38 crc kubenswrapper[4882]: E1002 16:40:38.863667 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33\": container with ID starting with 1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33 not found: ID does not exist" containerID="1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33" Oct 02 16:40:38 crc kubenswrapper[4882]: I1002 16:40:38.863735 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33"} err="failed to get container status \"1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33\": rpc error: code = NotFound desc = could not find container \"1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33\": container with ID starting with 1c2d943f7cedcae8cb4b990605e77312d4767cb04f5a23ecff9461640e57fb33 not found: ID does not exist" Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.401516 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.401835 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.401887 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.402949 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.404454 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b" gracePeriod=600 Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.801706 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerStarted","Data":"513b374180132a0f297498712d1e51739fc2ad0c26615a542f982ac395df9d3e"} Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.805733 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b" exitCode=0 Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.805754 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b"} Oct 02 
16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.805870 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599"} Oct 02 16:40:39 crc kubenswrapper[4882]: I1002 16:40:39.805904 4882 scope.go:117] "RemoveContainer" containerID="e31cec89ec1abb79b55918e38d3f35660f646818c4983c4b5f2f16b7f0dee66d" Oct 02 16:40:40 crc kubenswrapper[4882]: I1002 16:40:40.776533 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" path="/var/lib/kubelet/pods/ac797c1e-927e-42d4-81f5-6cb74bd13e8f/volumes" Oct 02 16:40:41 crc kubenswrapper[4882]: I1002 16:40:41.838665 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerStarted","Data":"94742d7620ee6e9af50f0bdad8f628d680e7d044141c67f87aac9ef2bfa64a8b"} Oct 02 16:40:41 crc kubenswrapper[4882]: I1002 16:40:41.840285 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 16:40:41 crc kubenswrapper[4882]: I1002 16:40:41.869560 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.723629522 podStartE2EDuration="6.8695359s" podCreationTimestamp="2025-10-02 16:40:35 +0000 UTC" firstStartedPulling="2025-10-02 16:40:37.427242208 +0000 UTC m=+1396.176471745" lastFinishedPulling="2025-10-02 16:40:40.573148596 +0000 UTC m=+1399.322378123" observedRunningTime="2025-10-02 16:40:41.860056634 +0000 UTC m=+1400.609286161" watchObservedRunningTime="2025-10-02 16:40:41.8695359 +0000 UTC m=+1400.618765427" Oct 02 16:40:42 crc kubenswrapper[4882]: I1002 16:40:42.856103 4882 generic.go:334] "Generic (PLEG): container finished" podID="34084c20-a5bd-437c-8e9b-f972e89bdc34" containerID="bcb72f41583b2dc39fc26211b1de727dd1fbe88709748d3f9c2182967e02454f" exitCode=0 Oct 02 16:40:42 crc kubenswrapper[4882]: I1002 16:40:42.857823 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gdh2h" event={"ID":"34084c20-a5bd-437c-8e9b-f972e89bdc34","Type":"ContainerDied","Data":"bcb72f41583b2dc39fc26211b1de727dd1fbe88709748d3f9c2182967e02454f"} Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.126956 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.127282 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.244752 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.351911 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data\") pod \"34084c20-a5bd-437c-8e9b-f972e89bdc34\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.352029 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts\") pod \"34084c20-a5bd-437c-8e9b-f972e89bdc34\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.352088 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle\") pod \"34084c20-a5bd-437c-8e9b-f972e89bdc34\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.352141 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-225j2\" (UniqueName: \"kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2\") pod \"34084c20-a5bd-437c-8e9b-f972e89bdc34\" (UID: \"34084c20-a5bd-437c-8e9b-f972e89bdc34\") " Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.358283 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts" (OuterVolumeSpecName: "scripts") pod "34084c20-a5bd-437c-8e9b-f972e89bdc34" (UID: "34084c20-a5bd-437c-8e9b-f972e89bdc34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.361358 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2" (OuterVolumeSpecName: "kube-api-access-225j2") pod "34084c20-a5bd-437c-8e9b-f972e89bdc34" (UID: "34084c20-a5bd-437c-8e9b-f972e89bdc34"). InnerVolumeSpecName "kube-api-access-225j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.383141 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data" (OuterVolumeSpecName: "config-data") pod "34084c20-a5bd-437c-8e9b-f972e89bdc34" (UID: "34084c20-a5bd-437c-8e9b-f972e89bdc34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.386456 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34084c20-a5bd-437c-8e9b-f972e89bdc34" (UID: "34084c20-a5bd-437c-8e9b-f972e89bdc34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.454683 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.454907 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.454987 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34084c20-a5bd-437c-8e9b-f972e89bdc34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.455063 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-225j2\" (UniqueName: \"kubernetes.io/projected/34084c20-a5bd-437c-8e9b-f972e89bdc34-kube-api-access-225j2\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.883315 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gdh2h" event={"ID":"34084c20-a5bd-437c-8e9b-f972e89bdc34","Type":"ContainerDied","Data":"d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f"} Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.883379 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a300a973b6d5e8ce0377b290273fb4dcfcb6ad221ada00fc254333d79a1d4f" Oct 02 16:40:44 crc kubenswrapper[4882]: I1002 16:40:44.883416 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gdh2h" Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.067534 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.067783 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-log" containerID="cri-o://2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763" gracePeriod=30 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.067964 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-api" containerID="cri-o://2b150cf2eb2efb3a53ced75685b33b145de98cd7a2558f2d67e2d3decf8c710e" gracePeriod=30 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.092928 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": EOF" Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.093243 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": EOF" Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.169323 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.169923 4882 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="87a4d803-638a-4953-9af6-0b5214faece7" containerName="nova-scheduler-scheduler" containerID="cri-o://e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" gracePeriod=30 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.191845 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.192340 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-log" containerID="cri-o://adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc" gracePeriod=30 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.192906 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-metadata" containerID="cri-o://5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26" gracePeriod=30 Oct 02 16:40:45 crc kubenswrapper[4882]: E1002 16:40:45.418269 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd0a177_52c7_414a_96b0_050782fb4b92.slice/crio-conmon-2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763.scope\": RecentStats: unable to find data in memory cache]" Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.895010 4882 generic.go:334] "Generic (PLEG): container finished" podID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerID="adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc" exitCode=143 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.895078 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerDied","Data":"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"} Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.896891 4882 generic.go:334] "Generic (PLEG): container finished" podID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerID="2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763" exitCode=143 Oct 02 16:40:45 crc kubenswrapper[4882]: I1002 16:40:45.896942 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerDied","Data":"2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763"} Oct 02 16:40:46 crc kubenswrapper[4882]: E1002 16:40:46.706706 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:40:46 crc kubenswrapper[4882]: E1002 16:40:46.708334 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:40:46 crc kubenswrapper[4882]: E1002 16:40:46.709395 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 02 16:40:46 crc kubenswrapper[4882]: E1002 16:40:46.709443 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="87a4d803-638a-4953-9af6-0b5214faece7" containerName="nova-scheduler-scheduler"
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.908533 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.931712 4882 generic.go:334] "Generic (PLEG): container finished" podID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerID="5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26" exitCode=0
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.931780 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerDied","Data":"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"}
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.931794 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.931819 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d219b6a-6fab-42c1-a03c-b2afd52cd86c","Type":"ContainerDied","Data":"7afaa2372dbacb9f4d11727ef4f04c553643cc300cd810587ca156e1c3c20db2"}
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.931865 4882 scope.go:117] "RemoveContainer" containerID="5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.946748 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data\") pod \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") "
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.946820 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs\") pod \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") "
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.946900 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2lv\" (UniqueName: \"kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv\") pod \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") "
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.947007 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle\") pod \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") "
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.947040 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs\") pod \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\" (UID: \"4d219b6a-6fab-42c1-a03c-b2afd52cd86c\") "
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.948306 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs" (OuterVolumeSpecName: "logs") pod "4d219b6a-6fab-42c1-a03c-b2afd52cd86c" (UID: "4d219b6a-6fab-42c1-a03c-b2afd52cd86c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.959518 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv" (OuterVolumeSpecName: "kube-api-access-xn2lv") pod "4d219b6a-6fab-42c1-a03c-b2afd52cd86c" (UID: "4d219b6a-6fab-42c1-a03c-b2afd52cd86c"). InnerVolumeSpecName "kube-api-access-xn2lv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.963467 4882 scope.go:117] "RemoveContainer" containerID="adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.988577 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data" (OuterVolumeSpecName: "config-data") pod "4d219b6a-6fab-42c1-a03c-b2afd52cd86c" (UID: "4d219b6a-6fab-42c1-a03c-b2afd52cd86c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:48 crc kubenswrapper[4882]: I1002 16:40:48.991254 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d219b6a-6fab-42c1-a03c-b2afd52cd86c" (UID: "4d219b6a-6fab-42c1-a03c-b2afd52cd86c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.012506 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4d219b6a-6fab-42c1-a03c-b2afd52cd86c" (UID: "4d219b6a-6fab-42c1-a03c-b2afd52cd86c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.050236 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.050282 4882 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.050299 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2lv\" (UniqueName: \"kubernetes.io/projected/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-kube-api-access-xn2lv\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.050312 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.050325 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d219b6a-6fab-42c1-a03c-b2afd52cd86c-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.057640 4882 scope.go:117] "RemoveContainer" containerID="5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.058073 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26\": container with ID starting with 5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26 not found: ID does not exist" containerID="5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.058116 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26"} err="failed to get container status \"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26\": rpc error: code = NotFound desc = could not find container \"5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26\": container with ID starting with 5b9fb0f5b69887f155456b0142804f005e85f7a81ceb255c19be5c3981d9ed26 not found: ID does not exist"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.058144 4882 scope.go:117] "RemoveContainer" containerID="adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.058839 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc\": container with ID starting with adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc not found: ID does not exist" containerID="adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.058884 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc"} err="failed to get container status \"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc\": rpc error: code = NotFound desc = could not find container \"adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc\": container with ID starting with adc636fb9af4505c026c3e8ccd4ddf810c8d9c0c322fc7c2e6de51e86f0db0dc not found: ID does not exist"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.263672 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.273234 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.284882 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.285396 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34084c20-a5bd-437c-8e9b-f972e89bdc34" containerName="nova-manage"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285422 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="34084c20-a5bd-437c-8e9b-f972e89bdc34" containerName="nova-manage"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.285457 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="init"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285466 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="init"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.285483 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-metadata"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285491 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-metadata"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.285507 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-log"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285514 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-log"
Oct 02 16:40:49 crc kubenswrapper[4882]: E1002 16:40:49.285527 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="dnsmasq-dns"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285534 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="dnsmasq-dns"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285752 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="34084c20-a5bd-437c-8e9b-f972e89bdc34" containerName="nova-manage"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285779 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac797c1e-927e-42d4-81f5-6cb74bd13e8f" containerName="dnsmasq-dns"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285802 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-log"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.285816 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" containerName="nova-metadata-metadata"
Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.287043 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.293826 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.294068 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.295721 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.354634 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hlh\" (UniqueName: \"kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.354697 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.354782 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.355061 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.355566 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.457258 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.457479 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hlh\" (UniqueName: \"kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.457565 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.457626 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.457714 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.458658 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.461769 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.462762 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.462912 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.476875 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hlh\" (UniqueName: \"kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh\") pod \"nova-metadata-0\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " pod="openstack/nova-metadata-0" Oct 02 16:40:49 crc kubenswrapper[4882]: I1002 16:40:49.630419 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.109340 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:40:50 crc kubenswrapper[4882]: W1002 16:40:50.122335 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b37b99_d806_4dce_a73e_653f8ebc5567.slice/crio-43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9 WatchSource:0}: Error finding container 43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9: Status 404 returned error can't find the container with id 43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9 Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.352810 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"] Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.355105 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.373258 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"] Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.484417 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.484508 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wl2r\" (UniqueName: \"kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.484662 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.586848 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.587039 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.587795 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content\") pod 
\"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.587925 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.588025 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wl2r\" (UniqueName: \"kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.609395 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl2r\" (UniqueName: \"kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r\") pod \"redhat-operators-jbqcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") " pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.685395 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.781567 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d219b6a-6fab-42c1-a03c-b2afd52cd86c" path="/var/lib/kubelet/pods/4d219b6a-6fab-42c1-a03c-b2afd52cd86c/volumes" Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.993994 4882 generic.go:334] "Generic (PLEG): container finished" podID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerID="2b150cf2eb2efb3a53ced75685b33b145de98cd7a2558f2d67e2d3decf8c710e" exitCode=0 Oct 02 16:40:50 crc kubenswrapper[4882]: I1002 16:40:50.994752 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerDied","Data":"2b150cf2eb2efb3a53ced75685b33b145de98cd7a2558f2d67e2d3decf8c710e"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.006847 4882 generic.go:334] "Generic (PLEG): container finished" podID="87a4d803-638a-4953-9af6-0b5214faece7" containerID="e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" exitCode=0 Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.006959 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87a4d803-638a-4953-9af6-0b5214faece7","Type":"ContainerDied","Data":"e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.007005 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87a4d803-638a-4953-9af6-0b5214faece7","Type":"ContainerDied","Data":"42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.007020 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42d9dae7a8368c387e22e144d680c4f7b16d2905e698054ad617369ccdd84a6a" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.012236 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.012767 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerStarted","Data":"2d2f7b7472a6b146044b414ca3cbab36ecb87a0c178a2b350803295c0549c80d"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.012814 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerStarted","Data":"e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.012824 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerStarted","Data":"43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9"} Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.057189 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.057167293 podStartE2EDuration="2.057167293s" podCreationTimestamp="2025-10-02 16:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:51.053009599 +0000 UTC m=+1409.802239126" watchObservedRunningTime="2025-10-02 16:40:51.057167293 +0000 UTC m=+1409.806396820" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.093540 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.097301 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle\") pod \"87a4d803-638a-4953-9af6-0b5214faece7\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.097427 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data\") pod \"87a4d803-638a-4953-9af6-0b5214faece7\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.097615 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87dpp\" (UniqueName: \"kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp\") pod \"87a4d803-638a-4953-9af6-0b5214faece7\" (UID: \"87a4d803-638a-4953-9af6-0b5214faece7\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.106042 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp" (OuterVolumeSpecName: "kube-api-access-87dpp") pod "87a4d803-638a-4953-9af6-0b5214faece7" (UID: "87a4d803-638a-4953-9af6-0b5214faece7"). InnerVolumeSpecName "kube-api-access-87dpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.155631 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data" (OuterVolumeSpecName: "config-data") pod "87a4d803-638a-4953-9af6-0b5214faece7" (UID: "87a4d803-638a-4953-9af6-0b5214faece7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.173137 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87a4d803-638a-4953-9af6-0b5214faece7" (UID: "87a4d803-638a-4953-9af6-0b5214faece7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.200718 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.200768 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4qb\" (UniqueName: \"kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.200915 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.200964 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201018 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201082 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs\") pod \"6cd0a177-52c7-414a-96b0-050782fb4b92\" (UID: \"6cd0a177-52c7-414a-96b0-050782fb4b92\") " Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201232 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs" (OuterVolumeSpecName: "logs") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201492 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201510 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd0a177-52c7-414a-96b0-050782fb4b92-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201519 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87dpp\" (UniqueName: \"kubernetes.io/projected/87a4d803-638a-4953-9af6-0b5214faece7-kube-api-access-87dpp\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.201528 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a4d803-638a-4953-9af6-0b5214faece7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.208328 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb" (OuterVolumeSpecName: "kube-api-access-ms4qb") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "kube-api-access-ms4qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.230536 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data" (OuterVolumeSpecName: "config-data") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.230562 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.265660 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.266803 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"] Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.274767 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6cd0a177-52c7-414a-96b0-050782fb4b92" (UID: "6cd0a177-52c7-414a-96b0-050782fb4b92"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.303322 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.303365 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.303380 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4qb\" (UniqueName: \"kubernetes.io/projected/6cd0a177-52c7-414a-96b0-050782fb4b92-kube-api-access-ms4qb\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.303390 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:51 crc kubenswrapper[4882]: I1002 16:40:51.303399 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd0a177-52c7-414a-96b0-050782fb4b92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.023897 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6cd0a177-52c7-414a-96b0-050782fb4b92","Type":"ContainerDied","Data":"a7c43102768ddaa5242c596799d155432d73a7e1f591d5dfc345f3dbbf54230c"} Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.024489 4882 scope.go:117] "RemoveContainer" containerID="2b150cf2eb2efb3a53ced75685b33b145de98cd7a2558f2d67e2d3decf8c710e" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.023954 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.026083 4882 generic.go:334] "Generic (PLEG): container finished" podID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerID="606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8" exitCode=0 Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.026172 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.026251 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerDied","Data":"606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8"} Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.026361 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerStarted","Data":"fe91da1f4bbd7c4591840beb8234d317beae2e03a5bdc4f55a5dd644008c7fab"} Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.062908 4882 scope.go:117] "RemoveContainer" containerID="2d1fbc0fa9ef2885603c09446d1bdf71ba66c8bc842604c61e72020bc8c3e763" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.089442 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.100363 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.105077 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.115163 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126050 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: E1002 16:40:52.126567 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a4d803-638a-4953-9af6-0b5214faece7" containerName="nova-scheduler-scheduler" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126592 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a4d803-638a-4953-9af6-0b5214faece7" containerName="nova-scheduler-scheduler" Oct 02 16:40:52 crc kubenswrapper[4882]: E1002 16:40:52.126612 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-log" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126621 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-log" Oct 02 16:40:52 crc kubenswrapper[4882]: E1002 16:40:52.126674 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-api" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126682 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-api" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126894 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-log" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126917 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a4d803-638a-4953-9af6-0b5214faece7" containerName="nova-scheduler-scheduler" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.126934 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" containerName="nova-api-api" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.141421 4882 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.141817 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.142648 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.142960 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.146272 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.146706 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.146840 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.148010 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.149225 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.218791 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.218860 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqxm\" (UniqueName: \"kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.218892 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.218958 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdnfp\" (UniqueName: \"kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.218996 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.219260 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.219467 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.219498 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.219558 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322188 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322305 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqxm\" (UniqueName: \"kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322328 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322377 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdnfp\" (UniqueName: \"kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322409 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322438 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322487 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322503 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322525 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.322921 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.328204 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.329136 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.330060 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.330461 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.331265 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.331200 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.342250 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqxm\" (UniqueName: \"kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm\") 
pod \"nova-scheduler-0\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.342433 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdnfp\" (UniqueName: \"kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp\") pod \"nova-api-0\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.471542 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.483848 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.790495 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd0a177-52c7-414a-96b0-050782fb4b92" path="/var/lib/kubelet/pods/6cd0a177-52c7-414a-96b0-050782fb4b92/volumes" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.791663 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a4d803-638a-4953-9af6-0b5214faece7" path="/var/lib/kubelet/pods/87a4d803-638a-4953-9af6-0b5214faece7/volumes" Oct 02 16:40:52 crc kubenswrapper[4882]: I1002 16:40:52.939827 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:40:52 crc kubenswrapper[4882]: W1002 16:40:52.941420 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e85b8e_fa65_4016_adbb_e72100f18388.slice/crio-14358c4b1a4ae01495b201e9e95620eb2076d624c76e8b67d55e6d525c3f8b16 WatchSource:0}: Error finding container 14358c4b1a4ae01495b201e9e95620eb2076d624c76e8b67d55e6d525c3f8b16: Status 404 returned error can't find the container with id 14358c4b1a4ae01495b201e9e95620eb2076d624c76e8b67d55e6d525c3f8b16 Oct 02 16:40:53 crc kubenswrapper[4882]: I1002 16:40:53.049112 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:40:53 crc kubenswrapper[4882]: I1002 16:40:53.066107 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06e85b8e-fa65-4016-adbb-e72100f18388","Type":"ContainerStarted","Data":"14358c4b1a4ae01495b201e9e95620eb2076d624c76e8b67d55e6d525c3f8b16"} Oct 02 16:40:53 crc kubenswrapper[4882]: I1002 16:40:53.069971 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerStarted","Data":"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.097080 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06e85b8e-fa65-4016-adbb-e72100f18388","Type":"ContainerStarted","Data":"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.099570 4882 generic.go:334] "Generic (PLEG): container finished" podID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerID="9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49" exitCode=0 Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.099630 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" 
event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerDied","Data":"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.105162 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerStarted","Data":"c0cade657e8d743e4938ba6a036b7b7b533715caf8cc87468d9eea57d5a85617"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.105237 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerStarted","Data":"63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.105253 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerStarted","Data":"f550241b15c95fcc625121707c03baa4fef5139fcf8590c21af01985fe802e52"} Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.126317 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.126296188 podStartE2EDuration="2.126296188s" podCreationTimestamp="2025-10-02 16:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:54.119887459 +0000 UTC m=+1412.869116996" watchObservedRunningTime="2025-10-02 16:40:54.126296188 +0000 UTC m=+1412.875525735" Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.170614 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.170586082 podStartE2EDuration="2.170586082s" podCreationTimestamp="2025-10-02 16:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:40:54.143037945 +0000 UTC m=+1412.892267512" watchObservedRunningTime="2025-10-02 16:40:54.170586082 +0000 UTC m=+1412.919815619" Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.630710 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 16:40:54 crc kubenswrapper[4882]: I1002 16:40:54.630778 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 16:40:55 crc kubenswrapper[4882]: I1002 16:40:55.118926 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerStarted","Data":"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"} Oct 02 16:40:55 crc kubenswrapper[4882]: I1002 16:40:55.138579 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jbqcc" podStartSLOduration=2.640449873 podStartE2EDuration="5.138556224s" podCreationTimestamp="2025-10-02 16:40:50 +0000 UTC" firstStartedPulling="2025-10-02 16:40:52.028819258 +0000 UTC m=+1410.778048775" lastFinishedPulling="2025-10-02 16:40:54.526925599 +0000 UTC m=+1413.276155126" observedRunningTime="2025-10-02 16:40:55.138183915 +0000 UTC m=+1413.887413442" watchObservedRunningTime="2025-10-02 16:40:55.138556224 +0000 UTC m=+1413.887785751" Oct 02 16:40:57 crc kubenswrapper[4882]: I1002 16:40:57.484261 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Oct 02 16:40:59 crc kubenswrapper[4882]: I1002 16:40:59.630816 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 16:40:59 crc kubenswrapper[4882]: I1002 16:40:59.631549 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 16:41:00 crc kubenswrapper[4882]: I1002 16:41:00.647396 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 16:41:00 crc kubenswrapper[4882]: I1002 16:41:00.647629 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 16:41:00 crc kubenswrapper[4882]: I1002 16:41:00.686440 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:41:00 crc kubenswrapper[4882]: I1002 16:41:00.686502 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:41:00 crc kubenswrapper[4882]: I1002 16:41:00.735300 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:41:01 crc kubenswrapper[4882]: I1002 16:41:01.226356 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jbqcc" Oct 02 16:41:01 crc kubenswrapper[4882]: I1002 16:41:01.274673 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"] Oct 02 16:41:02 crc kubenswrapper[4882]: I1002 16:41:02.472139 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:41:02 crc kubenswrapper[4882]: I1002 16:41:02.472198 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 16:41:02 crc kubenswrapper[4882]: I1002 16:41:02.485069 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 16:41:02 crc kubenswrapper[4882]: I1002 16:41:02.526324 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.199715 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jbqcc" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="registry-server" containerID="cri-o://1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe" gracePeriod=2 Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.242631 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.487803 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": 
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.487803 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.487918 4882 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.666542 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jbqcc"
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.764421 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content\") pod \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") "
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.764510 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities\") pod \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") "
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.764541 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wl2r\" (UniqueName: \"kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r\") pod \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\" (UID: \"84e3a095-a12c-4fa5-944a-c84dd3734fcc\") "
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.765126 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities" (OuterVolumeSpecName: "utilities") pod "84e3a095-a12c-4fa5-944a-c84dd3734fcc" (UID: "84e3a095-a12c-4fa5-944a-c84dd3734fcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.784461 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r" (OuterVolumeSpecName: "kube-api-access-6wl2r") pod "84e3a095-a12c-4fa5-944a-c84dd3734fcc" (UID: "84e3a095-a12c-4fa5-944a-c84dd3734fcc"). InnerVolumeSpecName "kube-api-access-6wl2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.829051 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84e3a095-a12c-4fa5-944a-c84dd3734fcc" (UID: "84e3a095-a12c-4fa5-944a-c84dd3734fcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.867161 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.867194 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e3a095-a12c-4fa5-944a-c84dd3734fcc-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:03 crc kubenswrapper[4882]: I1002 16:41:03.867220 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wl2r\" (UniqueName: \"kubernetes.io/projected/84e3a095-a12c-4fa5-944a-c84dd3734fcc-kube-api-access-6wl2r\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.217064 4882 generic.go:334] "Generic (PLEG): container finished" podID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerID="1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe" exitCode=0
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.217167 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jbqcc"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.217245 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerDied","Data":"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"}
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.217282 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jbqcc" event={"ID":"84e3a095-a12c-4fa5-944a-c84dd3734fcc","Type":"ContainerDied","Data":"fe91da1f4bbd7c4591840beb8234d317beae2e03a5bdc4f55a5dd644008c7fab"}
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.217299 4882 scope.go:117] "RemoveContainer" containerID="1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.249627 4882 scope.go:117] "RemoveContainer" containerID="9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.252395 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"]
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.262888 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jbqcc"]
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.275690 4882 scope.go:117] "RemoveContainer" containerID="606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.326362 4882 scope.go:117] "RemoveContainer" containerID="1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"
Oct 02 16:41:04 crc kubenswrapper[4882]: E1002 16:41:04.326785 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe\": container with ID starting with 1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe not found: ID does not exist" containerID="1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.326829 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe"} err="failed to get container status \"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe\": rpc error: code = NotFound desc = could not find container \"1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe\": container with ID starting with 1e26f6804a08168ce20972c3d4628a7aa04d89c96bf3d33684f9511e3f422afe not found: ID does not exist"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.326858 4882 scope.go:117] "RemoveContainer" containerID="9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"
Oct 02 16:41:04 crc kubenswrapper[4882]: E1002 16:41:04.327137 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49\": container with ID starting with 9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49 not found: ID does not exist" containerID="9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.327168 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49"} err="failed to get container status \"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49\": rpc error: code = NotFound desc = could not find container \"9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49\": container with ID starting with 9c7e10a8613b5fca6d7eae78bd46490198c20f883f57e6a82c914c3cb796eb49 not found: ID does not exist"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.327191 4882 scope.go:117] "RemoveContainer" containerID="606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8"
Oct 02 16:41:04 crc kubenswrapper[4882]: E1002 16:41:04.327477 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8\": container with ID starting with 606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8 not found: ID does not exist" containerID="606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.327524 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8"} err="failed to get container status \"606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8\": rpc error: code = NotFound desc = could not find container \"606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8\": container with ID starting with 606c7428051b4fd7829017a968cfb50a8eab0fae1b16c65d12bd2a04e1a29dd8 not found: ID does not exist"
Oct 02 16:41:04 crc kubenswrapper[4882]: I1002 16:41:04.771671 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" path="/var/lib/kubelet/pods/84e3a095-a12c-4fa5-944a-c84dd3734fcc/volumes"
Oct 02 16:41:06 crc kubenswrapper[4882]: I1002 16:41:06.207910 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 16:41:09 crc kubenswrapper[4882]: I1002 16:41:09.639483 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 16:41:09 crc kubenswrapper[4882]: I1002 16:41:09.640286 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 16:41:09 crc kubenswrapper[4882]: I1002 16:41:09.646473 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 16:41:09 crc kubenswrapper[4882]: I1002 16:41:09.653721 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.321347 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"]
Oct 02 16:41:12 crc kubenswrapper[4882]: E1002 16:41:12.322620 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="extract-utilities"
Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.322648 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="extract-utilities"
Oct 02 16:41:12 crc kubenswrapper[4882]: E1002 16:41:12.322694 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="registry-server"
Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.322709 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="registry-server"
Oct 02 16:41:12 crc kubenswrapper[4882]: E1002 16:41:12.322755 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="extract-content"
Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.322769 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="extract-content"
Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.323121 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e3a095-a12c-4fa5-944a-c84dd3734fcc" containerName="registry-server"
Need to start a new one" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.345895 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"] Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.465574 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.465702 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.466018 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nszbz\" (UniqueName: \"kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.487190 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.487836 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.489673 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.503804 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.568739 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.568917 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.569031 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nszbz\" (UniqueName: \"kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.570173 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.570265 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.594104 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nszbz\" (UniqueName: \"kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz\") pod \"certified-operators-5gsk8\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:12 crc kubenswrapper[4882]: I1002 16:41:12.654139 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:13 crc kubenswrapper[4882]: I1002 16:41:13.169723 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"] Oct 02 16:41:13 crc kubenswrapper[4882]: W1002 16:41:13.175664 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cad1d7_7476_462c_a25e_43f315d12b5d.slice/crio-56d93b6eecd4b038f07cad9c75af1b980bc993d68b1cfc358bcbdeb7a35b4bde WatchSource:0}: Error finding container 56d93b6eecd4b038f07cad9c75af1b980bc993d68b1cfc358bcbdeb7a35b4bde: Status 404 returned error can't find the container with id 56d93b6eecd4b038f07cad9c75af1b980bc993d68b1cfc358bcbdeb7a35b4bde Oct 02 16:41:13 crc kubenswrapper[4882]: I1002 16:41:13.350194 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerStarted","Data":"56d93b6eecd4b038f07cad9c75af1b980bc993d68b1cfc358bcbdeb7a35b4bde"} Oct 02 16:41:13 crc kubenswrapper[4882]: I1002 16:41:13.350413 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 16:41:13 crc kubenswrapper[4882]: I1002 16:41:13.360681 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 16:41:14 crc kubenswrapper[4882]: I1002 16:41:14.363073 4882 generic.go:334] "Generic (PLEG): container finished" podID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerID="7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c" exitCode=0 Oct 02 16:41:14 crc kubenswrapper[4882]: I1002 16:41:14.363136 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerDied","Data":"7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c"} Oct 02 16:41:15 crc kubenswrapper[4882]: I1002 16:41:15.377143 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerStarted","Data":"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097"} Oct 02 16:41:16 crc kubenswrapper[4882]: I1002 16:41:16.390818 4882 generic.go:334] "Generic 
(PLEG): container finished" podID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerID="5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097" exitCode=0 Oct 02 16:41:16 crc kubenswrapper[4882]: I1002 16:41:16.390918 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerDied","Data":"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097"} Oct 02 16:41:17 crc kubenswrapper[4882]: I1002 16:41:17.405429 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerStarted","Data":"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1"} Oct 02 16:41:17 crc kubenswrapper[4882]: I1002 16:41:17.431978 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5gsk8" podStartSLOduration=2.904466073 podStartE2EDuration="5.431952286s" podCreationTimestamp="2025-10-02 16:41:12 +0000 UTC" firstStartedPulling="2025-10-02 16:41:14.36523583 +0000 UTC m=+1433.114465357" lastFinishedPulling="2025-10-02 16:41:16.892722003 +0000 UTC m=+1435.641951570" observedRunningTime="2025-10-02 16:41:17.428419217 +0000 UTC m=+1436.177648744" watchObservedRunningTime="2025-10-02 16:41:17.431952286 +0000 UTC m=+1436.181181823" Oct 02 16:41:22 crc kubenswrapper[4882]: I1002 16:41:22.656977 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:22 crc kubenswrapper[4882]: I1002 16:41:22.657732 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:22 crc kubenswrapper[4882]: I1002 16:41:22.719115 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:23 crc kubenswrapper[4882]: I1002 16:41:23.535251 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:23 crc kubenswrapper[4882]: I1002 16:41:23.593430 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"] Oct 02 16:41:25 crc kubenswrapper[4882]: I1002 16:41:25.510010 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5gsk8" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="registry-server" containerID="cri-o://55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1" gracePeriod=2 Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.028176 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.159945 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities\") pod \"34cad1d7-7476-462c-a25e-43f315d12b5d\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.160420 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content\") pod \"34cad1d7-7476-462c-a25e-43f315d12b5d\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.160460 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nszbz\" (UniqueName: \"kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz\") pod \"34cad1d7-7476-462c-a25e-43f315d12b5d\" (UID: \"34cad1d7-7476-462c-a25e-43f315d12b5d\") " Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.161593 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities" (OuterVolumeSpecName: "utilities") pod "34cad1d7-7476-462c-a25e-43f315d12b5d" (UID: "34cad1d7-7476-462c-a25e-43f315d12b5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.168876 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz" (OuterVolumeSpecName: "kube-api-access-nszbz") pod "34cad1d7-7476-462c-a25e-43f315d12b5d" (UID: "34cad1d7-7476-462c-a25e-43f315d12b5d"). InnerVolumeSpecName "kube-api-access-nszbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.216011 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34cad1d7-7476-462c-a25e-43f315d12b5d" (UID: "34cad1d7-7476-462c-a25e-43f315d12b5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.262949 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.262989 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34cad1d7-7476-462c-a25e-43f315d12b5d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.263004 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nszbz\" (UniqueName: \"kubernetes.io/projected/34cad1d7-7476-462c-a25e-43f315d12b5d-kube-api-access-nszbz\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.523648 4882 generic.go:334] "Generic (PLEG): container finished" podID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerID="55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1" exitCode=0 Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.523709 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerDied","Data":"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1"} Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.523745 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gsk8" event={"ID":"34cad1d7-7476-462c-a25e-43f315d12b5d","Type":"ContainerDied","Data":"56d93b6eecd4b038f07cad9c75af1b980bc993d68b1cfc358bcbdeb7a35b4bde"} Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.523770 4882 scope.go:117] "RemoveContainer" containerID="55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.523789 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gsk8" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.546411 4882 scope.go:117] "RemoveContainer" containerID="5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.581880 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"] Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.591740 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5gsk8"] Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.608729 4882 scope.go:117] "RemoveContainer" containerID="7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.651081 4882 scope.go:117] "RemoveContainer" containerID="55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1" Oct 02 16:41:26 crc kubenswrapper[4882]: E1002 16:41:26.651620 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1\": container with ID starting with 55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1 not found: ID does not exist" containerID="55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.651676 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1"} err="failed to get container status \"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1\": rpc error: code = NotFound desc = could not find container \"55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1\": container with ID starting with 55b59d19ea2748e6cb966ab0d29362f0d17078673646d3d64e80276aa48eb7a1 not found: ID does not exist" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.651708 4882 scope.go:117] "RemoveContainer" containerID="5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097" Oct 02 16:41:26 crc kubenswrapper[4882]: E1002 16:41:26.652509 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097\": container with ID starting with 5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097 not found: ID does not exist" containerID="5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.652546 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097"} err="failed to get container status \"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097\": rpc error: code = NotFound desc = could not find container \"5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097\": container with ID starting with 5e3ce86d3546d929f43613c89278313a9edca2372f12084c8b326f903d936097 not found: ID does not exist" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.652561 4882 scope.go:117] "RemoveContainer" containerID="7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c" Oct 02 16:41:26 crc kubenswrapper[4882]: E1002 16:41:26.652861 4882 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c\": container with ID starting with 7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c not found: ID does not exist" containerID="7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.652889 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c"} err="failed to get container status \"7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c\": rpc error: code = NotFound desc = could not find container \"7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c\": container with ID starting with 7244a39ea792100eb529c4652629a21eaff9f5a440acd0d76d58fe60b0b9f56c not found: ID does not exist" Oct 02 16:41:26 crc kubenswrapper[4882]: I1002 16:41:26.775916 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" path="/var/lib/kubelet/pods/34cad1d7-7476-462c-a25e-43f315d12b5d/volumes" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.082497 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"] Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.083495 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="registry-server" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.083511 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="registry-server" Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.083548 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="extract-content" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.083554 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="extract-content" Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.083573 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="extract-utilities" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.083581 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="extract-utilities" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.083745 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="34cad1d7-7476-462c-a25e-43f315d12b5d" containerName="registry-server" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.085276 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.096985 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.138080 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.138334 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="ed84812d-565f-4ffc-a886-8cbddb32db0e" containerName="openstackclient" containerID="cri-o://3e8912cefdcd596bcc23c33fee09e9e99d8e66bf4762de8ad88089cdb917b1ac" gracePeriod=2 Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.154512 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.230770 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.230843 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.231725 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgsc\" (UniqueName: \"kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.283330 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.320863 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.321370 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed84812d-565f-4ffc-a886-8cbddb32db0e" containerName="openstackclient" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.321392 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed84812d-565f-4ffc-a886-8cbddb32db0e" containerName="openstackclient" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.321580 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed84812d-565f-4ffc-a886-8cbddb32db0e" containerName="openstackclient" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.329094 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement52af-account-delete-gwjd7" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.333805 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.333913 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.333993 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgsc\" (UniqueName: \"kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.334915 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.335252 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.348362 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.374033 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgsc\" (UniqueName: \"kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc\") pod \"redhat-marketplace-hzll4\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") " pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.402370 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.443501 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sz8t\" (UniqueName: \"kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t\") pod \"placement52af-account-delete-gwjd7\" (UID: \"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a\") " pod="openstack/placement52af-account-delete-gwjd7" Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.443980 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.444321 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. No retries permitted until 2025-10-02 16:41:34.94429639 +0000 UTC m=+1453.693525917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa") : configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.474881 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.510502 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.510752 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-rhsqp" podUID="92887968-fdd5-4653-a151-70e4a8f963fc" containerName="openstack-network-exporter" containerID="cri-o://6bc9f5c33140d2b4ca0bbd08d5002572bc61b1f513ed811d7b913685891c44e2" gracePeriod=30 Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.544749 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sz8t\" (UniqueName: \"kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t\") pod \"placement52af-account-delete-gwjd7\" (UID: \"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a\") " pod="openstack/placement52af-account-delete-gwjd7" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.566422 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.666314 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sz8t\" (UniqueName: \"kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t\") pod \"placement52af-account-delete-gwjd7\" (UID: \"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a\") " pod="openstack/placement52af-account-delete-gwjd7" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.717377 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fcqvb"] Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.916841 4882 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openstack/ovndbcluster-nb-etc-ovn-ovsdbserver-nb-0: PVC is being deleted" pod="openstack/ovsdbserver-nb-0" volumeName="ovndbcluster-nb-etc-ovn" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 
16:41:34.934540 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cindera969-account-delete-pgcm7"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.936595 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fcqvb"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.937127 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindera969-account-delete-pgcm7"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.937147 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.937166 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.939198 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cindera969-account-delete-pgcm7" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.941712 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-spmcq"] Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.959262 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement52af-account-delete-gwjd7" Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.962628 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" containerID="cri-o://cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" gracePeriod=30 Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.962893 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="openstack-network-exporter" containerID="cri-o://6f83cafdb181c103837b92a2bd78dedec85779409ecdc413b2e067e64c5babcf" gracePeriod=30 Oct 02 16:41:34 crc kubenswrapper[4882]: I1002 16:41:34.967484 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-spmcq"] Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.972196 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.985466 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. No retries permitted until 2025-10-02 16:41:35.98541845 +0000 UTC m=+1454.734648027 (durationBeforeRetry 1s). 
Oct 02 16:41:34 crc kubenswrapper[4882]: E1002 16:41:34.985466 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. No retries permitted until 2025-10-02 16:41:35.98541845 +0000 UTC m=+1454.734648027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa") : configmap "rabbitmq-cell1-config-data" not found
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.008084 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.008455 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d749c565-wv426" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="dnsmasq-dns" containerID="cri-o://bcbf61e67fe3d94f3a2827ff0936322bdb1ccefdfe19185199aaf29bd9db4ce7" gracePeriod=10
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.064674 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.066408 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6e73-account-delete-49s2r"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.079121 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr4v\" (UniqueName: \"kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v\") pod \"cindera969-account-delete-pgcm7\" (UID: \"bb7cef74-9b55-4213-a934-7c1d2c058aab\") " pod="openstack/cindera969-account-delete-pgcm7"
Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.081523 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.081609 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data podName:74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42 nodeName:}" failed. No retries permitted until 2025-10-02 16:41:35.581567555 +0000 UTC m=+1454.330797082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data") pod "rabbitmq-server-0" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42") : configmap "rabbitmq-config-data" not found
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.093936 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.186279 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvlz\" (UniqueName: \"kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz\") pod \"barbican6e73-account-delete-49s2r\" (UID: \"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7\") " pod="openstack/barbican6e73-account-delete-49s2r"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.186607 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr4v\" (UniqueName: \"kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v\") pod \"cindera969-account-delete-pgcm7\" (UID: \"bb7cef74-9b55-4213-a934-7c1d2c058aab\") " pod="openstack/cindera969-account-delete-pgcm7"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.204352 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.205608 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancec9ca-account-delete-csrvn"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.274409 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k42n2"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.276399 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr4v\" (UniqueName: \"kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v\") pod \"cindera969-account-delete-pgcm7\" (UID: \"bb7cef74-9b55-4213-a934-7c1d2c058aab\") " pod="openstack/cindera969-account-delete-pgcm7"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.304379 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzs2\" (UniqueName: \"kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2\") pod \"glancec9ca-account-delete-csrvn\" (UID: \"ce1313f5-3aed-43a9-881d-bf61353ab6bd\") " pod="openstack/glancec9ca-account-delete-csrvn"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.304429 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvlz\" (UniqueName: \"kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz\") pod \"barbican6e73-account-delete-49s2r\" (UID: \"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7\") " pod="openstack/barbican6e73-account-delete-49s2r"
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.317477 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"]
Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.352052 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k42n2"]
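The MountVolume.SetUp failures in this stretch show the volume manager's doubling retry delay: the rabbitmq-cell1 config-data retry is first scheduled 500ms out (16:41:34.444321), then 1s out (16:41:34.985466), and the rabbitmq-server config-data follows the same 500ms-then-1s progression. Since the ConfigMaps were deleted as part of this teardown, the retries can never succeed and simply continue until the pods themselves are removed. A sketch of that schedule (the 500ms starting delay matches the log; the cap is an assumption, as it is never reached in this excerpt):

package main

import (
    "fmt"
    "time"
)

func main() {
    delay := 500 * time.Millisecond // initial durationBeforeRetry, as logged
    maxDelay := 2 * time.Minute     // assumed upper bound, not visible in the log
    for attempt := 1; attempt <= 5; attempt++ {
        fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
        delay *= 2 // the doubling visible between successive log entries
        if delay > maxDelay {
            delay = maxDelay
        }
    }
}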
Need to start a new one" pod="openstack/cindera969-account-delete-pgcm7" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.388975 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvlz\" (UniqueName: \"kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz\") pod \"barbican6e73-account-delete-49s2r\" (UID: \"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7\") " pod="openstack/barbican6e73-account-delete-49s2r" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.407851 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzs2\" (UniqueName: \"kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2\") pod \"glancec9ca-account-delete-csrvn\" (UID: \"ce1313f5-3aed-43a9-881d-bf61353ab6bd\") " pod="openstack/glancec9ca-account-delete-csrvn" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.427183 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8kjnt"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.464038 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzs2\" (UniqueName: \"kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2\") pod \"glancec9ca-account-delete-csrvn\" (UID: \"ce1313f5-3aed-43a9-881d-bf61353ab6bd\") " pod="openstack/glancec9ca-account-delete-csrvn" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.485320 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancec9ca-account-delete-csrvn" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.491658 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6e73-account-delete-49s2r" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.499306 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8kjnt"] Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.641837 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.641946 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data podName:74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42 nodeName:}" failed. No retries permitted until 2025-10-02 16:41:36.641925163 +0000 UTC m=+1455.391154690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data") pod "rabbitmq-server-0" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42") : configmap "rabbitmq-config-data" not found Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.697313 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.698620 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.713150 4882 generic.go:334] "Generic (PLEG): container finished" podID="65f4b488-20e6-4007-b1b4-891b06b16276" containerID="bcbf61e67fe3d94f3a2827ff0936322bdb1ccefdfe19185199aaf29bd9db4ce7" exitCode=0 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.713375 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d749c565-wv426" event={"ID":"65f4b488-20e6-4007-b1b4-891b06b16276","Type":"ContainerDied","Data":"bcbf61e67fe3d94f3a2827ff0936322bdb1ccefdfe19185199aaf29bd9db4ce7"} Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.728180 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.738963 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rhsqp_92887968-fdd5-4653-a151-70e4a8f963fc/openstack-network-exporter/0.log" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.739025 4882 generic.go:334] "Generic (PLEG): container finished" podID="92887968-fdd5-4653-a151-70e4a8f963fc" containerID="6bc9f5c33140d2b4ca0bbd08d5002572bc61b1f513ed811d7b913685891c44e2" exitCode=2 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.739143 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rhsqp" event={"ID":"92887968-fdd5-4653-a151-70e4a8f963fc","Type":"ContainerDied","Data":"6bc9f5c33140d2b4ca0bbd08d5002572bc61b1f513ed811d7b913685891c44e2"} Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.752609 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.754605 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.767979 4882 generic.go:334] "Generic (PLEG): container finished" podID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerID="6f83cafdb181c103837b92a2bd78dedec85779409ecdc413b2e067e64c5babcf" exitCode=2 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.768052 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerDied","Data":"6f83cafdb181c103837b92a2bd78dedec85779409ecdc413b2e067e64c5babcf"} Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.771914 4882 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-v7dcx" message=< Oct 02 16:41:35 crc kubenswrapper[4882]: Exiting ovn-controller (1) [ OK ] Oct 02 16:41:35 crc kubenswrapper[4882]: > Oct 02 16:41:35 crc kubenswrapper[4882]: E1002 16:41:35.771946 4882 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-v7dcx" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" containerID="cri-o://1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.771982 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-v7dcx" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" containerID="cri-o://1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" gracePeriod=29 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.788264 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.800756 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.801541 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="openstack-network-exporter" containerID="cri-o://1c434c360e363c534658d517b5e5ca0981d70ee9a4edbc3206e881217869a5a9" gracePeriod=300 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.825396 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.825776 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-568599c566-7t7js" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-log" containerID="cri-o://9e298188f83c2dcdfd495c92ac971fa8c62421680bfe7edca5990f87e39223ce" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.826391 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-568599c566-7t7js" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-api" containerID="cri-o://c61dc991f9c47938a3b335bd886b13edcae91852caa89730fe80338f674f2a36" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.842706 4882 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.850403 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdht\" (UniqueName: \"kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht\") pod \"novacell050c5-account-delete-7t9wx\" (UID: \"261d06a1-c07d-4430-9984-24531fa935c6\") " pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.850818 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7mv\" (UniqueName: \"kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv\") pod \"neutron0481-account-delete-lt5bg\" (UID: \"6a85d34b-abed-4d9a-aa75-9781d96a4c8b\") " pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.852906 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.886430 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.887116 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="openstack-network-exporter" containerID="cri-o://7b2b5dfddc7012648f92825b22fc3dff9d26b581f27c153da193dcb7c40d999c" gracePeriod=300 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.898083 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.898559 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="cinder-scheduler" containerID="cri-o://8b9f83a977cbb31784f8c0ea343e36bdd7f6e161aef6bbb69620207d95bcb979" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.899830 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="probe" containerID="cri-o://08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.909832 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.941106 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.941655 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-server" containerID="cri-o://d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942126 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="swift-recon-cron" containerID="cri-o://beb784c14f77f12a3964b2ef2082da3578704e6f5267fe025604c315cd46e6ba" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942182 
4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="rsync" containerID="cri-o://01c2f3830e5d56cd56ba9cb9e3532e635f8aa8907530ce2426f74c73661d8d82" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942233 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-expirer" containerID="cri-o://7f14fdb5932d30e2872b1a5885f0b55773e7edd9a6545624f4e0a5cc89fc9950" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942266 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-updater" containerID="cri-o://a02d6df731a02fe06af482e56bb798e1405d30e9a7582568ae70058f44d8649b" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942301 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-auditor" containerID="cri-o://dd782c2cb9bfe49afd713e8f98727186b4594352b4f94c3334399dcf0f3ddcde" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942330 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-replicator" containerID="cri-o://c324410a34d486d3741d64c1759808560c80771c9771319557a7fa0a96becbf8" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942362 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-server" containerID="cri-o://64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942396 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-updater" containerID="cri-o://98c3f10e6bd125fee0ba2ce83eeb60ba8e4bb2141fa7130276214ba1b0155863" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942424 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-auditor" containerID="cri-o://e68f25a45eb857be77e95e76cbef672ca88b8dfae0b6833226a9a98f31867462" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942454 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-replicator" containerID="cri-o://7c21495965895e4c5e03f6e2ccf050c5159064c4ed28968f122e21de15ea7bf0" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942484 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-server" containerID="cri-o://0df9344da4286c51066abdcf7f4044bd806c337ad8fb5c3fd6ad431458d2179a" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942520 4882 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-reaper" containerID="cri-o://92d1d3fc20a44da92843f516fd7e287b2820813b7b752be600e88b83707ea9d9" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942551 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-auditor" containerID="cri-o://19db6cdfa5b1c6ddeac4c3af0c7dac82506480145ee4d50af2f88dd3e7515251" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.942592 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-replicator" containerID="cri-o://f6dd5331866217f7c3fd5cb62ed15451551f9d0a3146c188905b3c00fb8cb139" gracePeriod=30 Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.956559 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdht\" (UniqueName: \"kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht\") pod \"novacell050c5-account-delete-7t9wx\" (UID: \"261d06a1-c07d-4430-9984-24531fa935c6\") " pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.960746 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7mv\" (UniqueName: \"kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv\") pod \"neutron0481-account-delete-lt5bg\" (UID: \"6a85d34b-abed-4d9a-aa75-9781d96a4c8b\") " pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:35 crc kubenswrapper[4882]: I1002 16:41:35.960788 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfg96\" (UniqueName: \"kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96\") pod \"novaapie3bb-account-delete-wjz8t\" (UID: \"8c726379-ca3b-428c-8091-c1870692c652\") " pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.010722 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8rr77"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.027140 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8rr77"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.034137 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.060495 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdht\" (UniqueName: \"kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht\") pod \"novacell050c5-account-delete-7t9wx\" (UID: \"261d06a1-c07d-4430-9984-24531fa935c6\") " pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.060509 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jnfn2"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.071415 4882 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfg96\" (UniqueName: \"kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96\") pod \"novaapie3bb-account-delete-wjz8t\" (UID: \"8c726379-ca3b-428c-8091-c1870692c652\") " pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.071470 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.071816 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. No retries permitted until 2025-10-02 16:41:38.071758571 +0000 UTC m=+1456.820988098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa") : configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.101493 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jnfn2"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.104796 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfg96\" (UniqueName: \"kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96\") pod \"novaapie3bb-account-delete-wjz8t\" (UID: \"8c726379-ca3b-428c-8091-c1870692c652\") " pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.114157 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.119155 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.119292 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.146770 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7mv\" (UniqueName: \"kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv\") pod \"neutron0481-account-delete-lt5bg\" (UID: \"6a85d34b-abed-4d9a-aa75-9781d96a4c8b\") " pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.226681 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nx9bj"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.268257 4882 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nx9bj"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.281655 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gdh2h"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.296681 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-w7rvw"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.299563 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gdh2h"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.309397 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.321869 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-w7rvw"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.345270 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-52af-account-create-rvw9j"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.359124 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-52af-account-create-rvw9j"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.377265 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="ovsdbserver-nb" containerID="cri-o://cabd7eae8001349116cafdd29dd1df7fb3b6e49cd24abaa8c9cf5171da8dae76" gracePeriod=300 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.406718 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.408376 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="ovsdbserver-sb" containerID="cri-o://3f058549c5a2bb644c18b7d33ab3953b22f72d0657caaa9bed2e0b3555256692" gracePeriod=300 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.502851 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="rabbitmq" containerID="cri-o://578c94adafe095435dbcb87f647011c96a388fb1edfe93dbe48b4ca0639d68b2" gracePeriod=604800 Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.549770 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7 is running failed: container process not found" containerID="1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.551746 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7 is running failed: container process not found" containerID="1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.558686 4882 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7 is running failed: container process not found" containerID="1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.558778 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-v7dcx" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.574767 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.575058 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-log" containerID="cri-o://12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.579136 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-httpd" containerID="cri-o://e93df0f0131f30a72dba182ec0f2f33b1dc2c6158f2a976f2348f6a3c71cbaaf" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.619419 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.619795 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api-log" containerID="cri-o://d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.620433 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api" containerID="cri-o://93e77877e930a932f31f96d8c2b7d4e11b5ab6f70e5fa57cb3caad68f215b647" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.658755 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.659149 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-log" containerID="cri-o://2d5462c744d32c02349363789c3e1d2d942136eca10c547d25ef94b8fbf2a4c0" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.659510 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-httpd" containerID="cri-o://45bb1b9d3f14a231f70b2267ff101ad41616e2eb0158bfb01cf9db8c6d56d8ca" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.672804 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-7994f6475f-bw8mv"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.673353 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7994f6475f-bw8mv" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-api" containerID="cri-o://94c670bb838359315515a372b7547069910917af30838091af0996d23b21c9e8" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.673555 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7994f6475f-bw8mv" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-httpd" containerID="cri-o://0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.707164 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.711373 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data podName:74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42 nodeName:}" failed. No retries permitted until 2025-10-02 16:41:38.711345364 +0000 UTC m=+1457.460574891 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data") pod "rabbitmq-server-0" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42") : configmap "rabbitmq-config-data" not found Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.731318 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.731710 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener-log" containerID="cri-o://03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.732068 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener" containerID="cri-o://f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f" gracePeriod=30 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.761282 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-89xpx"] Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.811744 4882 generic.go:334] "Generic (PLEG): container finished" podID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerID="1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.837451 4882 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 02 16:41:36 crc kubenswrapper[4882]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 16:41:36 crc kubenswrapper[4882]: + source /usr/local/bin/container-scripts/functions Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNBridge=br-int Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNRemote=tcp:localhost:6642 Oct 02 16:41:36 crc kubenswrapper[4882]: ++ 
OVNEncapType=geneve Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNAvailabilityZones= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ EnableChassisAsGateway=true Oct 02 16:41:36 crc kubenswrapper[4882]: ++ PhysicalNetworks= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNHostName= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 16:41:36 crc kubenswrapper[4882]: ++ ovs_dir=/var/lib/openvswitch Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 16:41:36 crc kubenswrapper[4882]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + cleanup_ovsdb_server_semaphore Oct 02 16:41:36 crc kubenswrapper[4882]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 16:41:36 crc kubenswrapper[4882]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-wsqlw" message=< Oct 02 16:41:36 crc kubenswrapper[4882]: Exiting ovsdb-server (5) [ OK ] Oct 02 16:41:36 crc kubenswrapper[4882]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 16:41:36 crc kubenswrapper[4882]: + source /usr/local/bin/container-scripts/functions Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNBridge=br-int Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNRemote=tcp:localhost:6642 Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNEncapType=geneve Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNAvailabilityZones= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ EnableChassisAsGateway=true Oct 02 16:41:36 crc kubenswrapper[4882]: ++ PhysicalNetworks= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNHostName= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 16:41:36 crc kubenswrapper[4882]: ++ ovs_dir=/var/lib/openvswitch Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 16:41:36 crc kubenswrapper[4882]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + cleanup_ovsdb_server_semaphore Oct 02 16:41:36 crc kubenswrapper[4882]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 16:41:36 crc kubenswrapper[4882]: > Oct 02 16:41:36 crc kubenswrapper[4882]: E1002 16:41:36.837512 4882 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 02 16:41:36 crc kubenswrapper[4882]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 16:41:36 crc kubenswrapper[4882]: + source /usr/local/bin/container-scripts/functions Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNBridge=br-int Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNRemote=tcp:localhost:6642 Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNEncapType=geneve Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNAvailabilityZones= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ EnableChassisAsGateway=true Oct 02 16:41:36 crc kubenswrapper[4882]: ++ PhysicalNetworks= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ OVNHostName= Oct 02 16:41:36 crc kubenswrapper[4882]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 16:41:36 crc kubenswrapper[4882]: ++ ovs_dir=/var/lib/openvswitch Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 16:41:36 crc kubenswrapper[4882]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 16:41:36 crc kubenswrapper[4882]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + sleep 0.5 Oct 02 16:41:36 crc kubenswrapper[4882]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 16:41:36 crc kubenswrapper[4882]: + cleanup_ovsdb_server_semaphore Oct 02 16:41:36 crc kubenswrapper[4882]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 16:41:36 crc kubenswrapper[4882]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 16:41:36 crc kubenswrapper[4882]: > pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" containerID="cri-o://6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.837589 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" containerID="cri-o://6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" gracePeriod=28 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.868890 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7132edf6-8a37-4230-a3a5-4703be721a78/ovsdbserver-sb/0.log" Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.868937 4882 generic.go:334] "Generic (PLEG): container finished" podID="7132edf6-8a37-4230-a3a5-4703be721a78" containerID="1c434c360e363c534658d517b5e5ca0981d70ee9a4edbc3206e881217869a5a9" exitCode=2 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.868957 4882 generic.go:334] "Generic (PLEG): container finished" podID="7132edf6-8a37-4230-a3a5-4703be721a78" containerID="3f058549c5a2bb644c18b7d33ab3953b22f72d0657caaa9bed2e0b3555256692" exitCode=143 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950835 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="01c2f3830e5d56cd56ba9cb9e3532e635f8aa8907530ce2426f74c73661d8d82" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950889 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="7f14fdb5932d30e2872b1a5885f0b55773e7edd9a6545624f4e0a5cc89fc9950" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950898 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="a02d6df731a02fe06af482e56bb798e1405d30e9a7582568ae70058f44d8649b" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950907 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="dd782c2cb9bfe49afd713e8f98727186b4594352b4f94c3334399dcf0f3ddcde" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950915 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="c324410a34d486d3741d64c1759808560c80771c9771319557a7fa0a96becbf8" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950923 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="98c3f10e6bd125fee0ba2ce83eeb60ba8e4bb2141fa7130276214ba1b0155863" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950954 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="e68f25a45eb857be77e95e76cbef672ca88b8dfae0b6833226a9a98f31867462" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950962 4882 generic.go:334] "Generic 
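Note the arithmetic visible above: the pod's termination grace is 30 seconds, the PreStop hook (its xtrace shows three 0.5 s sleeps waiting on the semaphore file before being killed, hence exit 137 = 128 + SIGKILL) consumed roughly two seconds of it, and the subsequent kill is logged with gracePeriod=28 (29 earlier for ovn-controller). A minimal sketch of that budget accounting in Go; an illustrative model under the assumption that hook time is charged against the same grace budget, not the kubelet's actual implementation:

```go
package main

import (
	"fmt"
	"time"
)

// remainingGrace models a grace budget shared between the PreStop hook and
// the runtime kill: whatever the hook consumed is subtracted, with a small
// floor (2s here, an assumption for this sketch) so the kill is never
// issued with zero grace.
func remainingGrace(total, hookTook time.Duration) time.Duration {
	rest := total - hookTook
	if rest < 2*time.Second {
		rest = 2 * time.Second
	}
	return rest
}

func main() {
	total := 30 * time.Second   // terminationGracePeriodSeconds
	hookTook := 2 * time.Second // PreStop ran ~2s before failing with 137
	fmt.Printf("gracePeriod=%d\n", int(remainingGrace(total, hookTook).Seconds()))
	// Output: gracePeriod=28
}
```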
(PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="7c21495965895e4c5e03f6e2ccf050c5159064c4ed28968f122e21de15ea7bf0" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950972 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="92d1d3fc20a44da92843f516fd7e287b2820813b7b752be600e88b83707ea9d9" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950982 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="19db6cdfa5b1c6ddeac4c3af0c7dac82506480145ee4d50af2f88dd3e7515251" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.950992 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="f6dd5331866217f7c3fd5cb62ed15451551f9d0a3146c188905b3c00fb8cb139" exitCode=0 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.956195 4882 generic.go:334] "Generic (PLEG): container finished" podID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerID="9e298188f83c2dcdfd495c92ac971fa8c62421680bfe7edca5990f87e39223ce" exitCode=143 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.958000 4882 generic.go:334] "Generic (PLEG): container finished" podID="ed84812d-565f-4ffc-a886-8cbddb32db0e" containerID="3e8912cefdcd596bcc23c33fee09e9e99d8e66bf4762de8ad88089cdb917b1ac" exitCode=137 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.960101 4882 generic.go:334] "Generic (PLEG): container finished" podID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerID="7b2b5dfddc7012648f92825b22fc3dff9d26b581f27c153da193dcb7c40d999c" exitCode=2 Oct 02 16:41:36 crc kubenswrapper[4882]: I1002 16:41:36.960112 4882 generic.go:334] "Generic (PLEG): container finished" podID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerID="cabd7eae8001349116cafdd29dd1df7fb3b6e49cd24abaa8c9cf5171da8dae76" exitCode=0 Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.009568 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" containerID="cri-o://8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" gracePeriod=28 Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.018179 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1698fd61-b948-4440-a0c2-b4cb3a7f933c" path="/var/lib/kubelet/pods/1698fd61-b948-4440-a0c2-b4cb3a7f933c/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.020489 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf4f202-ceb4-4eb0-aa35-22313a7d95e3" path="/var/lib/kubelet/pods/2bf4f202-ceb4-4eb0-aa35-22313a7d95e3/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.021106 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34084c20-a5bd-437c-8e9b-f972e89bdc34" path="/var/lib/kubelet/pods/34084c20-a5bd-437c-8e9b-f972e89bdc34/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.023178 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39804fe6-5476-4f26-a743-83c1852229ec" path="/var/lib/kubelet/pods/39804fe6-5476-4f26-a743-83c1852229ec/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.024366 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41dfd05c-8ce6-4327-978f-1e936b530b16" 
path="/var/lib/kubelet/pods/41dfd05c-8ce6-4327-978f-1e936b530b16/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.025318 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a0980b-843c-4fdd-a638-fe28f7bf4491" path="/var/lib/kubelet/pods/68a0980b-843c-4fdd-a638-fe28f7bf4491/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.025906 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af4887d-ad2c-42e6-a473-88947a33d7cd" path="/var/lib/kubelet/pods/6af4887d-ad2c-42e6-a473-88947a33d7cd/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.027062 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1352ce-9050-4709-bffd-c834b8ef1cb0" path="/var/lib/kubelet/pods/6c1352ce-9050-4709-bffd-c834b8ef1cb0/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.027695 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8e6711-063d-42f2-afb6-426e7617d062" path="/var/lib/kubelet/pods/ef8e6711-063d-42f2-afb6-426e7617d062/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.055584 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3616734-9206-483f-a173-6fa0dffe1f82" path="/var/lib/kubelet/pods/f3616734-9206-483f-a173-6fa0dffe1f82/volumes" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056320 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056356 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-89xpx"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056378 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx" event={"ID":"e205184e-bcf8-498d-8a1a-bc1c8539c2ae","Type":"ContainerDied","Data":"1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056401 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6e73-account-create-qc4g4"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056416 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerDied","Data":"1c434c360e363c534658d517b5e5ca0981d70ee9a4edbc3206e881217869a5a9"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056427 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056440 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerDied","Data":"3f058549c5a2bb644c18b7d33ab3953b22f72d0657caaa9bed2e0b3555256692"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056449 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"01c2f3830e5d56cd56ba9cb9e3532e635f8aa8907530ce2426f74c73661d8d82"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056461 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"7f14fdb5932d30e2872b1a5885f0b55773e7edd9a6545624f4e0a5cc89fc9950"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 
16:41:37.056470 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"a02d6df731a02fe06af482e56bb798e1405d30e9a7582568ae70058f44d8649b"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056479 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"dd782c2cb9bfe49afd713e8f98727186b4594352b4f94c3334399dcf0f3ddcde"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056487 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"c324410a34d486d3741d64c1759808560c80771c9771319557a7fa0a96becbf8"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056497 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"98c3f10e6bd125fee0ba2ce83eeb60ba8e4bb2141fa7130276214ba1b0155863"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056506 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"e68f25a45eb857be77e95e76cbef672ca88b8dfae0b6833226a9a98f31867462"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056515 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"7c21495965895e4c5e03f6e2ccf050c5159064c4ed28968f122e21de15ea7bf0"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056524 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"92d1d3fc20a44da92843f516fd7e287b2820813b7b752be600e88b83707ea9d9"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056532 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"19db6cdfa5b1c6ddeac4c3af0c7dac82506480145ee4d50af2f88dd3e7515251"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056540 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"f6dd5331866217f7c3fd5cb62ed15451551f9d0a3146c188905b3c00fb8cb139"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056549 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerDied","Data":"9e298188f83c2dcdfd495c92ac971fa8c62421680bfe7edca5990f87e39223ce"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056561 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerDied","Data":"7b2b5dfddc7012648f92825b22fc3dff9d26b581f27c153da193dcb7c40d999c"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056573 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerDied","Data":"cabd7eae8001349116cafdd29dd1df7fb3b6e49cd24abaa8c9cf5171da8dae76"} 
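The "Generic (PLEG): container finished" lines and the matching "SyncLoop (PLEG): event for pod" storm above share one event shape: a pod UID, an event type, and the container ID as payload. A minimal sketch of how a relist could diff container states into such events, in Go; this models the pattern visible in the log, not the kubelet's actual PLEG implementation, and the container IDs are shortened placeholders:

```go
package main

import "fmt"

// PodLifecycleEvent mirrors the event={"ID":...,"Type":"ContainerDied","Data":...}
// shape printed in the log.
type PodLifecycleEvent struct {
	ID   string // pod UID
	Type string // e.g. "ContainerDied"
	Data string // container ID
}

// relist compares a previous and current running-state snapshot and emits
// one ContainerDied event per container that stopped since the last pass.
func relist(podID string, previous, current map[string]bool) []PodLifecycleEvent {
	var events []PodLifecycleEvent
	for id, wasRunning := range previous {
		if wasRunning && !current[id] {
			events = append(events, PodLifecycleEvent{ID: podID, Type: "ContainerDied", Data: id})
		}
	}
	return events
}

func main() {
	prev := map[string]bool{"01c2f3830e5d": true, "7f14fdb5932d": true}
	curr := map[string]bool{"01c2f3830e5d": false, "7f14fdb5932d": false}
	for _, e := range relist("9cd26acb-1d48-48f3-b39d-b274bdcd3cce", prev, curr) {
		fmt.Printf("SyncLoop (PLEG): event for pod event={ID:%s Type:%s Data:%s}\n", e.ID, e.Type, e.Data)
	}
}
```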
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.056794 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker-log" containerID="cri-o://1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.057358 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker" containerID="cri-o://b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.064584 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6e73-account-create-qc4g4"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.122282 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.122955 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" containerID="cri-o://e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.123473 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" containerID="cri-o://2d2f7b7472a6b146044b414ca3cbab36ecb87a0c178a2b350803295c0549c80d" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.149876 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qdmww"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.161054 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qdmww"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.176716 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.187729 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c9ca-account-create-tgl7f"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.194063 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c9ca-account-create-tgl7f"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.239054 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0481-account-delete-lt5bg"
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.294736 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.295130 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" containerID="cri-o://63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.295554 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-api" containerID="cri-o://c0cade657e8d743e4938ba6a036b7b7b533715caf8cc87468d9eea57d5a85617" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.307948 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-56bd-account-create-bkfmj"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.323000 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-56bd-account-create-bkfmj"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.352071 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.363664 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.364447 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68897cb7f8-5wgv6" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api-log" containerID="cri-o://81bedb9aeb702667b8a6d51fc9ff288fc91ae9eafee405c8744188c4fd827eed" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.364610 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68897cb7f8-5wgv6" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api" containerID="cri-o://42bd27be6298749bedf886c1fdcb856b9c7bdde2ff094fd0bbcab807ca38fc43" gracePeriod=30
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.373282 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-pdk2q"]
Oct 02 16:41:37 crc kubenswrapper[4882]: E1002 16:41:37.375762 4882 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb80a4ca1_90cc_4c29_a2de_13b4db198cef.slice/crio-1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf2f3ba0_9530_4875_84e1_df99cc4761a6.slice/crio-08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a33ca09_ff99_44fd_a978_ef69315caf26.slice/crio-d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fd93bba_dd83_4256_952a_d60fd3cefef4.slice/crio-conmon-0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd26acb_1d48_48f3_b39d_b274bdcd3cce.slice/crio-d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd26acb_1d48_48f3_b39d_b274bdcd3cce.slice/crio-64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a33ca09_ff99_44fd_a978_ef69315caf26.slice/crio-conmon-d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b37b99_d806_4dce_a73e_653f8ebc5567.slice/crio-e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0de09a9_9a37_4c03_abd4_002230d4f583.slice/crio-conmon-12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b37b99_d806_4dce_a73e_653f8ebc5567.slice/crio-conmon-e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod909dd4dd_0b5d_4b6b_b64a_7c59cd26256f.slice/crio-63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93.scope\": RecentStats: unable to find data in memory cache]"
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.378298 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hqqxc"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.427832 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hqqxc"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.437373 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-pdk2q"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.448935 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell050c5-account-delete-7t9wx"
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.450050 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.462351 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0481-account-create-crqcr"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.484230 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0481-account-create-crqcr"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.549281 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gv7hs"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.551178 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gv7hs"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.559281 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-50c5-account-create-25qqs"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.566312 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.573503 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapie3bb-account-delete-wjz8t"
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.574447 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-50c5-account-create-25qqs"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.591277 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.613498 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d749c565-wv426"
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.613746 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e3bb-account-create-dlpct"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.631358 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4z7p2"]
Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.640177 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="rabbitmq" containerID="cri-o://5681a9da320379f932adf71053bca8a31e077d6a1f4a09e0b83765c5625e4ade" gracePeriod=604800
Need to start a new one" pod="openstack/ovn-controller-v7dcx" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684396 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684534 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684638 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684674 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkft\" (UniqueName: \"kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684706 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684749 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684787 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684814 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684835 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684862 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nclpr\" (UniqueName: \"kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr\") pod 
\"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684908 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684950 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc\") pod \"65f4b488-20e6-4007-b1b4-891b06b16276\" (UID: \"65f4b488-20e6-4007-b1b4-891b06b16276\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.684972 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.685372 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.685436 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.685589 4882 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.685971 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run" (OuterVolumeSpecName: "var-run") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.687474 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts" (OuterVolumeSpecName: "scripts") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.687518 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.715395 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft" (OuterVolumeSpecName: "kube-api-access-xtkft") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "kube-api-access-xtkft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.732158 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.732475 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="bd2c7127-3671-4e79-aad9-01146803019e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" gracePeriod=30 Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.808930 4882 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.808967 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.808976 4882 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.808986 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkft\" (UniqueName: \"kubernetes.io/projected/65f4b488-20e6-4007-b1b4-891b06b16276-kube-api-access-xtkft\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.814175 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4gj46"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.826537 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4z7p2"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.832403 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr" (OuterVolumeSpecName: "kube-api-access-nclpr") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "kube-api-access-nclpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.832487 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e3bb-account-create-dlpct"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.875702 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4gj46"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.912586 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nclpr\" (UniqueName: \"kubernetes.io/projected/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-kube-api-access-nclpr\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.916337 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.918454 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lgmf6"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.928432 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.928806 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://59d76121f684bf647e82fb7939a01b0eef06443bf9081dd3691a4619a445253e" gracePeriod=30 Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.952613 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.952887 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45" gracePeriod=30 Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.968705 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config" (OuterVolumeSpecName: "config") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.970525 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lgmf6"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.977771 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.981031 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.988655 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d749c565-wv426" event={"ID":"65f4b488-20e6-4007-b1b4-891b06b16276","Type":"ContainerDied","Data":"d085527c20f1844eb1887caf8efbb33859e7d86fba7c8d43ee22201e205a8e86"} Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.988846 4882 scope.go:117] "RemoveContainer" containerID="bcbf61e67fe3d94f3a2827ff0936322bdb1ccefdfe19185199aaf29bd9db4ce7" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.989098 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d749c565-wv426" Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.991628 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:41:37 crc kubenswrapper[4882]: I1002 16:41:37.991950 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" containerName="nova-scheduler-scheduler" containerID="cri-o://01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" gracePeriod=30 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.007786 4882 generic.go:334] "Generic (PLEG): container finished" podID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerID="0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.007920 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerDied","Data":"0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.013515 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.016784 4882 generic.go:334] "Generic (PLEG): container finished" podID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerID="03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.016916 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerDied","Data":"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.017491 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") pod \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\" (UID: \"e205184e-bcf8-498d-8a1a-bc1c8539c2ae\") " Oct 02 16:41:38 crc kubenswrapper[4882]: W1002 16:41:38.018026 4882 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e205184e-bcf8-498d-8a1a-bc1c8539c2ae/volumes/kubernetes.io~secret/ovn-controller-tls-certs Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.018044 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "e205184e-bcf8-498d-8a1a-bc1c8539c2ae" (UID: "e205184e-bcf8-498d-8a1a-bc1c8539c2ae"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.019390 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.019420 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.019434 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.019447 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e205184e-bcf8-498d-8a1a-bc1c8539c2ae-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.035919 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rhsqp_92887968-fdd5-4653-a151-70e4a8f963fc/openstack-network-exporter/0.log" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.035992 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.047919 4882 generic.go:334] "Generic (PLEG): container finished" podID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerID="2d5462c744d32c02349363789c3e1d2d942136eca10c547d25ef94b8fbf2a4c0" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.048053 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerDied","Data":"2d5462c744d32c02349363789c3e1d2d942136eca10c547d25ef94b8fbf2a4c0"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.053021 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.053698 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.071725 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.074075 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"142d02f0-5616-42b6-b6fc-b37df2639f8a","Type":"ContainerDied","Data":"3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.074129 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a79108d8ebc383aa3c81dcda7cd8fa5e1b04e9eb9f44aa52da49fde5e6197e4" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.079650 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.081995 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7132edf6-8a37-4230-a3a5-4703be721a78/ovsdbserver-sb/0.log" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.082076 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.082162 4882 scope.go:117] "RemoveContainer" containerID="51e94341fed25c5ed8e828868b22a1e16ae34b4b130c9a7bed213fad084185d0" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.085657 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="galera" containerID="cri-o://ce86d9d2458ffecb5f7269b0144634e7551d8db7396b16d935a1e65eea3b1726" gracePeriod=30 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.087095 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.087910 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65f4b488-20e6-4007-b1b4-891b06b16276" (UID: "65f4b488-20e6-4007-b1b4-891b06b16276"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.111172 4882 generic.go:334] "Generic (PLEG): container finished" podID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerID="63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.111351 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerDied","Data":"63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120089 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config\") pod \"ed84812d-565f-4ffc-a886-8cbddb32db0e\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120206 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120269 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2kq6\" (UniqueName: \"kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120307 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120330 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg\") pod \"ed84812d-565f-4ffc-a886-8cbddb32db0e\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120355 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120406 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret\") pod \"ed84812d-565f-4ffc-a886-8cbddb32db0e\" 
(UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120468 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120496 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle\") pod \"ed84812d-565f-4ffc-a886-8cbddb32db0e\" (UID: \"ed84812d-565f-4ffc-a886-8cbddb32db0e\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120525 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config\") pod \"92887968-fdd5-4653-a151-70e4a8f963fc\" (UID: \"92887968-fdd5-4653-a151-70e4a8f963fc\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120956 4882 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120973 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.120983 4882 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65f4b488-20e6-4007-b1b4-891b06b16276-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: E1002 16:41:38.121039 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:38 crc kubenswrapper[4882]: E1002 16:41:38.121085 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. No retries permitted until 2025-10-02 16:41:42.121068842 +0000 UTC m=+1460.870298369 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa") : configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.121322 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.121774 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config" (OuterVolumeSpecName: "config") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.121985 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.125248 4882 generic.go:334] "Generic (PLEG): container finished" podID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerID="08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.125316 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerDied","Data":"08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.127581 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6" (OuterVolumeSpecName: "kube-api-access-c2kq6") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "kube-api-access-c2kq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.132783 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7132edf6-8a37-4230-a3a5-4703be721a78/ovsdbserver-sb/0.log" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.132869 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7132edf6-8a37-4230-a3a5-4703be721a78","Type":"ContainerDied","Data":"e2acd5cc3a0b8d34bc7e2567945a783842fddd34f7684be0d416c168d0faa381"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.132949 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.152022 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg" (OuterVolumeSpecName: "kube-api-access-f68dg") pod "ed84812d-565f-4ffc-a886-8cbddb32db0e" (UID: "ed84812d-565f-4ffc-a886-8cbddb32db0e"). InnerVolumeSpecName "kube-api-access-f68dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.152186 4882 scope.go:117] "RemoveContainer" containerID="1c434c360e363c534658d517b5e5ca0981d70ee9a4edbc3206e881217869a5a9" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.154945 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ed84812d-565f-4ffc-a886-8cbddb32db0e" (UID: "ed84812d-565f-4ffc-a886-8cbddb32db0e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.163554 4882 generic.go:334] "Generic (PLEG): container finished" podID="21667760-8ee1-456b-af11-a501cdf77822" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.163633 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerDied","Data":"6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.172898 4882 generic.go:334] "Generic (PLEG): container finished" podID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerID="e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.172939 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerDied","Data":"e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.174882 4882 generic.go:334] "Generic (PLEG): container finished" podID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerID="1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.174918 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerDied","Data":"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.176372 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.178142 4882 generic.go:334] "Generic (PLEG): container finished" podID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerID="12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.178178 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerDied","Data":"12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.181225 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v7dcx" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.181340 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v7dcx" event={"ID":"e205184e-bcf8-498d-8a1a-bc1c8539c2ae","Type":"ContainerDied","Data":"7e1a0b6bad817661a14e4dbd53c4e6d723d2a62695a9e41d24346f02e1a6c19d"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.183298 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.186109 4882 generic.go:334] "Generic (PLEG): container finished" podID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerID="81bedb9aeb702667b8a6d51fc9ff288fc91ae9eafee405c8744188c4fd827eed" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.186170 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerDied","Data":"81bedb9aeb702667b8a6d51fc9ff288fc91ae9eafee405c8744188c4fd827eed"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.187025 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed84812d-565f-4ffc-a886-8cbddb32db0e" (UID: "ed84812d-565f-4ffc-a886-8cbddb32db0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.189893 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rhsqp_92887968-fdd5-4653-a151-70e4a8f963fc/openstack-network-exporter/0.log" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.189968 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rhsqp" event={"ID":"92887968-fdd5-4653-a151-70e4a8f963fc","Type":"ContainerDied","Data":"fbac9db1e92df8c7a67eb7a16d69de9b0b4b044771209e8e76103bb854fbf827"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.190038 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rhsqp" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.192194 4882 generic.go:334] "Generic (PLEG): container finished" podID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerID="7a6b59e62874a5415470e6e26597234895bad91ce2daf76f62f504d8922025ec" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.192330 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerDied","Data":"7a6b59e62874a5415470e6e26597234895bad91ce2daf76f62f504d8922025ec"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.192460 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerStarted","Data":"0569fe3ebe9a74515d4b6735e93a13ed9e9165e9474837bc90084d4a0ad1fe70"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.213385 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.213560 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="0df9344da4286c51066abdcf7f4044bd806c337ad8fb5c3fd6ad431458d2179a" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.213682 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe" exitCode=0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.214172 4882 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.214556 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"0df9344da4286c51066abdcf7f4044bd806c337ad8fb5c3fd6ad431458d2179a"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.214668 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.218912 4882 generic.go:334] "Generic (PLEG): container finished" podID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerID="d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156" exitCode=143 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.218968 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerDied","Data":"d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156"} Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.219192 4882 scope.go:117] "RemoveContainer" containerID="3f058549c5a2bb644c18b7d33ab3953b22f72d0657caaa9bed2e0b3555256692" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222219 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87l2b\" (UniqueName: \"kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222278 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222329 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222368 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222441 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222512 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222574 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222625 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222670 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222699 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222727 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222747 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzg6d\" (UniqueName: \"kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222789 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"142d02f0-5616-42b6-b6fc-b37df2639f8a\" (UID: \"142d02f0-5616-42b6-b6fc-b37df2639f8a\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222827 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222851 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.222882 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config\") pod \"7132edf6-8a37-4230-a3a5-4703be721a78\" (UID: \"7132edf6-8a37-4230-a3a5-4703be721a78\") " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223814 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223833 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223848 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92887968-fdd5-4653-a151-70e4a8f963fc-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223860 4882 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223870 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2kq6\" (UniqueName: \"kubernetes.io/projected/92887968-fdd5-4653-a151-70e4a8f963fc-kube-api-access-c2kq6\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223879 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223891 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f68dg\" (UniqueName: \"kubernetes.io/projected/ed84812d-565f-4ffc-a886-8cbddb32db0e-kube-api-access-f68dg\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.223900 4882 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/92887968-fdd5-4653-a151-70e4a8f963fc-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.225207 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config" (OuterVolumeSpecName: "config") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.225761 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts" (OuterVolumeSpecName: "scripts") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.226920 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.227463 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config" (OuterVolumeSpecName: "config") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.229337 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.231943 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts" (OuterVolumeSpecName: "scripts") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.250135 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b" (OuterVolumeSpecName: "kube-api-access-87l2b") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "kube-api-access-87l2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.277174 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.277463 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.317421 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d" (OuterVolumeSpecName: "kube-api-access-bzg6d") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "kube-api-access-bzg6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.342394 4882 scope.go:117] "RemoveContainer" containerID="3e8912cefdcd596bcc23c33fee09e9e99d8e66bf4762de8ad88089cdb917b1ac" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.352953 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.352982 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142d02f0-5616-42b6-b6fc-b37df2639f8a-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.352991 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353002 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353011 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzg6d\" (UniqueName: \"kubernetes.io/projected/7132edf6-8a37-4230-a3a5-4703be721a78-kube-api-access-bzg6d\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353030 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353042 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353051 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7132edf6-8a37-4230-a3a5-4703be721a78-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353060 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87l2b\" (UniqueName: \"kubernetes.io/projected/142d02f0-5616-42b6-b6fc-b37df2639f8a-kube-api-access-87l2b\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.353068 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.368474 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.368920 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-54f5c99697-qjljg" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-httpd" containerID="cri-o://130c4a62c2b640eaef7a5ead25af6e7d9449b34010c056f857dac87df15f4c2a" gracePeriod=30 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.369067 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-54f5c99697-qjljg" 
podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-server" containerID="cri-o://df07f33c6b476688e48e7aac7112b9a252910e43159a82f95d8b350ed2eaa481" gracePeriod=30 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.405137 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "92887968-fdd5-4653-a151-70e4a8f963fc" (UID: "92887968-fdd5-4653-a151-70e4a8f963fc"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.408598 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.416993 4882 scope.go:117] "RemoveContainer" containerID="1f2d8848b77e9534e43baa9a1a70a74df78b609eeb03d5849f82614fd931c8f7" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.423538 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v7dcx"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.425076 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.427810 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindera969-account-delete-pgcm7"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.434264 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.441252 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d749c565-wv426"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.446599 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.454304 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/92887968-fdd5-4653-a151-70e4a8f963fc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.454335 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.543233 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.558520 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.597448 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.607416 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.634118 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.674607 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"] Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.691155 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.722232 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.730409 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: W1002 16:41:38.737619 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c726379_ca3b_428c_8091_c1870692c652.slice/crio-a75932169d93f7f6c0536bdd0582c27f913497562a2e6b65fd183bc8593d7af0 WatchSource:0}: Error finding container a75932169d93f7f6c0536bdd0582c27f913497562a2e6b65fd183bc8593d7af0: Status 404 returned error can't find the container with id a75932169d93f7f6c0536bdd0582c27f913497562a2e6b65fd183bc8593d7af0 Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.763767 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.764400 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.764427 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:38 crc kubenswrapper[4882]: E1002 16:41:38.765436 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 16:41:38 crc kubenswrapper[4882]: E1002 16:41:38.765672 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data podName:74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42 nodeName:}" failed. No retries permitted until 2025-10-02 16:41:42.765643449 +0000 UTC m=+1461.514872976 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data") pod "rabbitmq-server-0" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42") : configmap "rabbitmq-config-data" not found Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.771053 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ed84812d-565f-4ffc-a886-8cbddb32db0e" (UID: "ed84812d-565f-4ffc-a886-8cbddb32db0e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.814679 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acce7dc-c882-4dab-9e8e-09bc781e559a" path="/var/lib/kubelet/pods/1acce7dc-c882-4dab-9e8e-09bc781e559a/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.815271 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222d830a-b239-42ec-9ad1-2fb0dbe9d0fb" path="/var/lib/kubelet/pods/222d830a-b239-42ec-9ad1-2fb0dbe9d0fb/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.815758 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c3b95d-74fa-4e1d-a3ed-422750068a7d" path="/var/lib/kubelet/pods/29c3b95d-74fa-4e1d-a3ed-422750068a7d/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.833555 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0e5b41-6a96-453c-a2e1-b69c89186d5b" path="/var/lib/kubelet/pods/3c0e5b41-6a96-453c-a2e1-b69c89186d5b/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.834560 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4991e502-56d0-483d-b0cb-8c5d2e3d9282" path="/var/lib/kubelet/pods/4991e502-56d0-483d-b0cb-8c5d2e3d9282/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.835072 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c9678e-5c7a-4a53-9e13-732c9bbc2cba" path="/var/lib/kubelet/pods/58c9678e-5c7a-4a53-9e13-732c9bbc2cba/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.848994 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" path="/var/lib/kubelet/pods/65f4b488-20e6-4007-b1b4-891b06b16276/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.849845 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71290daa-6e18-4227-9a32-b73f3c816d89" path="/var/lib/kubelet/pods/71290daa-6e18-4227-9a32-b73f3c816d89/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.850323 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738df5a4-f251-460f-9d47-51a4b20967ad" path="/var/lib/kubelet/pods/738df5a4-f251-460f-9d47-51a4b20967ad/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.850815 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843b17cb-e35a-455c-94c4-7afe31780c91" path="/var/lib/kubelet/pods/843b17cb-e35a-455c-94c4-7afe31780c91/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.853455 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b09392d-ac00-4006-a716-d489b76594ed" path="/var/lib/kubelet/pods/9b09392d-ac00-4006-a716-d489b76594ed/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.853924 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62e6f10-05a5-4e97-ae04-3d312702455d" path="/var/lib/kubelet/pods/c62e6f10-05a5-4e97-ae04-3d312702455d/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.854451 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" path="/var/lib/kubelet/pods/e205184e-bcf8-498d-8a1a-bc1c8539c2ae/volumes" Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.858047 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c754b5-9feb-467d-97e9-3ff17db97760" path="/var/lib/kubelet/pods/e6c754b5-9feb-467d-97e9-3ff17db97760/volumes" 
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.858592 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed84812d-565f-4ffc-a886-8cbddb32db0e" path="/var/lib/kubelet/pods/ed84812d-565f-4ffc-a886-8cbddb32db0e/volumes"
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.859035 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e66594-8ce0-4953-a294-039c0d04d069" path="/var/lib/kubelet/pods/f2e66594-8ce0-4953-a294-039c0d04d069/volumes"
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.864435 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff35d0a9-776d-4b61-8714-c5632ef1f4a6" path="/var/lib/kubelet/pods/ff35d0a9-776d-4b61-8714-c5632ef1f4a6/volumes"
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.866557 4882 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ed84812d-565f-4ffc-a886-8cbddb32db0e-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.897289 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused"
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.914818 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.932904 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "7132edf6-8a37-4230-a3a5-4703be721a78" (UID: "7132edf6-8a37-4230-a3a5-4703be721a78"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.943965 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused"
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.981177 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:38 crc kubenswrapper[4882]: I1002 16:41:38.981225 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7132edf6-8a37-4230-a3a5-4703be721a78-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.017015 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "142d02f0-5616-42b6-b6fc-b37df2639f8a" (UID: "142d02f0-5616-42b6-b6fc-b37df2639f8a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.096045 4882 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/142d02f0-5616-42b6-b6fc-b37df2639f8a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.267636 4882 generic.go:334] "Generic (PLEG): container finished" podID="b156c0c6-4395-4609-8260-5ee8943d6813" containerID="ce86d9d2458ffecb5f7269b0144634e7551d8db7396b16d935a1e65eea3b1726" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.291249 4882 generic.go:334] "Generic (PLEG): container finished" podID="e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" containerID="2155a8ead6611495ac2656cb91200c865eee705b0367cd61131e9b9efc40fcda" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.367594 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0481-account-delete-lt5bg" event={"ID":"6a85d34b-abed-4d9a-aa75-9781d96a4c8b","Type":"ContainerStarted","Data":"c7ea1be20bb1a5e7036c38bd966626c7e2305fbd438fa7acd35f4d593c017765"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.367639 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerDied","Data":"ce86d9d2458ffecb5f7269b0144634e7551d8db7396b16d935a1e65eea3b1726"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.367658 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement52af-account-delete-gwjd7" event={"ID":"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a","Type":"ContainerDied","Data":"2155a8ead6611495ac2656cb91200c865eee705b0367cd61131e9b9efc40fcda"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.367675 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement52af-account-delete-gwjd7" event={"ID":"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a","Type":"ContainerStarted","Data":"d4f11f19829a63c1469b2e48510a08a0a86d0347531add1f3d36574c30e4c014"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.367692 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6e73-account-delete-49s2r" event={"ID":"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7","Type":"ContainerStarted","Data":"16233244175ba86c65f647d97a08985f6039ba582598ad75192f27c67f742768"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.387477 4882 generic.go:334] "Generic (PLEG): container finished" podID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerID="c61dc991f9c47938a3b335bd886b13edcae91852caa89730fe80338f674f2a36" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.387570 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerDied","Data":"c61dc991f9c47938a3b335bd886b13edcae91852caa89730fe80338f674f2a36"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.397820 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancec9ca-account-delete-csrvn" event={"ID":"ce1313f5-3aed-43a9-881d-bf61353ab6bd","Type":"ContainerStarted","Data":"f53758632758ed8cde1e7558fff569434b2d069520d4ee62de24b7655f12d844"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.400009 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindera969-account-delete-pgcm7" event={"ID":"bb7cef74-9b55-4213-a934-7c1d2c058aab","Type":"ContainerStarted","Data":"c7c8f20f0aa3b08760289bcc88e807fce20aff7bc381cbd60edcf8168e662923"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.401470 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell050c5-account-delete-7t9wx" event={"ID":"261d06a1-c07d-4430-9984-24531fa935c6","Type":"ContainerStarted","Data":"c54c38cc5cd5edb23cbba0efa73190a8814f344f25cf5ef0ee52cc158cf4f02b"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.404648 4882 generic.go:334] "Generic (PLEG): container finished" podID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerID="df07f33c6b476688e48e7aac7112b9a252910e43159a82f95d8b350ed2eaa481" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.404668 4882 generic.go:334] "Generic (PLEG): container finished" podID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerID="130c4a62c2b640eaef7a5ead25af6e7d9449b34010c056f857dac87df15f4c2a" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.404698 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerDied","Data":"df07f33c6b476688e48e7aac7112b9a252910e43159a82f95d8b350ed2eaa481"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.404718 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerDied","Data":"130c4a62c2b640eaef7a5ead25af6e7d9449b34010c056f857dac87df15f4c2a"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.414541 4882 generic.go:334] "Generic (PLEG): container finished" podID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" containerID="59d76121f684bf647e82fb7939a01b0eef06443bf9081dd3691a4619a445253e" exitCode=0
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.414638 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc","Type":"ContainerDied","Data":"59d76121f684bf647e82fb7939a01b0eef06443bf9081dd3691a4619a445253e"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.416536 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapie3bb-account-delete-wjz8t" event={"ID":"8c726379-ca3b-428c-8091-c1870692c652","Type":"ContainerStarted","Data":"a75932169d93f7f6c0536bdd0582c27f913497562a2e6b65fd183bc8593d7af0"}
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.416558 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.466462 4882 scope.go:117] "RemoveContainer" containerID="6bc9f5c33140d2b4ca0bbd08d5002572bc61b1f513ed811d7b913685891c44e2"
Oct 02 16:41:39 crc kubenswrapper[4882]: E1002 16:41:39.612683 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 16:41:39 crc kubenswrapper[4882]: E1002 16:41:39.613786 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 16:41:39 crc kubenswrapper[4882]: E1002 16:41:39.615437 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 16:41:39 crc kubenswrapper[4882]: E1002 16:41:39.615506 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="bd2c7127-3671-4e79-aad9-01146803019e" containerName="nova-cell1-conductor-conductor"
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.643355 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h77k7"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.655089 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h77k7"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.666319 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a969-account-create-xfhzm"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.678820 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a969-account-create-xfhzm"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.683677 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.686495 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cindera969-account-delete-pgcm7"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.748486 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.748922 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-rhsqp"]
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.819047 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.819093 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.819131 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.819166 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24pph\" (UniqueName: \"kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.819255 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.821077 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.846918 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph" (OuterVolumeSpecName: "kube-api-access-24pph") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "kube-api-access-24pph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.932331 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.934014 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.938649 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.938983 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.939562 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.939790 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.940124 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sb95\" (UniqueName: \"kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.940520 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.940934 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.942444 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") pod \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\" (UID: \"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.942545 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated\") pod \"b156c0c6-4395-4609-8260-5ee8943d6813\" (UID: \"b156c0c6-4395-4609-8260-5ee8943d6813\") "
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.943244 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24pph\" (UniqueName: \"kubernetes.io/projected/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-kube-api-access-24pph\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.940679 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: W1002 16:41:39.944138 4882 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc/volumes/kubernetes.io~secret/combined-ca-bundle
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.944166 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.944843 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.951370 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.951580 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95" (OuterVolumeSpecName: "kube-api-access-9sb95") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "kube-api-access-9sb95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.953183 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.958854 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets" (OuterVolumeSpecName: "secrets") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.986685 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data" (OuterVolumeSpecName: "config-data") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:39 crc kubenswrapper[4882]: I1002 16:41:39.998994 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.038906 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.047345 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.049650 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-default\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.049681 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.049692 4882 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.049705 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b156c0c6-4395-4609-8260-5ee8943d6813-config-data-generated\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.050638 4882 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-operator-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.050683 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.050694 4882 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b156c0c6-4395-4609-8260-5ee8943d6813-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.050704 4882 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-secrets\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.050755 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.074114 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.074130 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sb95\" (UniqueName: \"kubernetes.io/projected/b156c0c6-4395-4609-8260-5ee8943d6813-kube-api-access-9sb95\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.121330 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement52af-account-delete-gwjd7"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.131431 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.146360 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68fcbbc6fc-4n89r"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.169942 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:56116->10.217.0.179:9292: read: connection reset by peer"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.169899 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.179:9292/healthcheck\": read tcp 10.217.0.2:56106->10.217.0.179:9292: read: connection reset by peer"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.181369 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.184876 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": read tcp 10.217.0.2:53422->10.217.0.166:8776: read: connection reset by peer"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.188414 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b156c0c6-4395-4609-8260-5ee8943d6813" (UID: "b156c0c6-4395-4609-8260-5ee8943d6813"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.192482 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.198433 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-54f5c99697-qjljg"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.217409 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-568599c566-7t7js"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.246473 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.276955 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sz8t\" (UniqueName: \"kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t\") pod \"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a\" (UID: \"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.276997 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs\") pod \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277032 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs\") pod \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277050 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle\") pod \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277103 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom\") pod \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277129 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data\") pod \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277221 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom\") pod \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277254 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6h9q\" (UniqueName: \"kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q\") pod \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277277 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle\") pod \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\" (UID: \"b80a4ca1-90cc-4c29-a2de-13b4db198cef\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277313 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znmr2\" (UniqueName: \"kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2\") pod \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277348 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data\") pod \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\" (UID: \"d6733b4d-ebf1-43cd-9960-3c25fca82e64\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277688 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.277738 4882 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b156c0c6-4395-4609-8260-5ee8943d6813-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.288826 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs" (OuterVolumeSpecName: "logs") pod "b80a4ca1-90cc-4c29-a2de-13b4db198cef" (UID: "b80a4ca1-90cc-4c29-a2de-13b4db198cef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.289192 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs" (OuterVolumeSpecName: "logs") pod "d6733b4d-ebf1-43cd-9960-3c25fca82e64" (UID: "d6733b4d-ebf1-43cd-9960-3c25fca82e64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379450 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379514 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379537 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379565 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z22qw\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379597 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379644 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379713 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379781 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379837 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379856 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379874 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379924 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd\") pod \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\" (UID: \"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.379981 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.380010 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.380027 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5nbp\" (UniqueName: \"kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") "
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.380389 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b80a4ca1-90cc-4c29-a2de-13b4db198cef-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.380399 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6733b4d-ebf1-43cd-9960-3c25fca82e64-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.396670 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.410607 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs" (OuterVolumeSpecName: "logs") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.427358 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.482830 4882 generic.go:334] "Generic (PLEG): container finished" podID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerID="b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224" exitCode=0
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.482946 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerDied","Data":"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.482978 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68fcbbc6fc-4n89r" event={"ID":"b80a4ca1-90cc-4c29-a2de-13b4db198cef","Type":"ContainerDied","Data":"847a6b5f37acf49bf72646e223241418883ec7a098e17e1719709fcb52776e86"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.482998 4882 scope.go:117] "RemoveContainer" containerID="b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.483075 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68fcbbc6fc-4n89r"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.484640 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.484662 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.484672 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6537afcb-4015-45f6-bdb5-68e0625c6ea6-logs\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.491985 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b80a4ca1-90cc-4c29-a2de-13b4db198cef" (UID: "b80a4ca1-90cc-4c29-a2de-13b4db198cef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.501686 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6733b4d-ebf1-43cd-9960-3c25fca82e64" (UID: "d6733b4d-ebf1-43cd-9960-3c25fca82e64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.502172 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q" (OuterVolumeSpecName: "kube-api-access-d6h9q") pod "b80a4ca1-90cc-4c29-a2de-13b4db198cef" (UID: "b80a4ca1-90cc-4c29-a2de-13b4db198cef"). InnerVolumeSpecName "kube-api-access-d6h9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.504183 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t" (OuterVolumeSpecName: "kube-api-access-4sz8t") pod "e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" (UID: "e0ac3371-e1ab-4d5d-a543-9b9b68a0118a"). InnerVolumeSpecName "kube-api-access-4sz8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.505447 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2" (OuterVolumeSpecName: "kube-api-access-znmr2") pod "d6733b4d-ebf1-43cd-9960-3c25fca82e64" (UID: "d6733b4d-ebf1-43cd-9960-3c25fca82e64"). InnerVolumeSpecName "kube-api-access-znmr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.507483 4882 generic.go:334] "Generic (PLEG): container finished" podID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerID="93e77877e930a932f31f96d8c2b7d4e11b5ab6f70e5fa57cb3caad68f215b647" exitCode=0
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.507568 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerDied","Data":"93e77877e930a932f31f96d8c2b7d4e11b5ab6f70e5fa57cb3caad68f215b647"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.517083 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.521820 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw" (OuterVolumeSpecName: "kube-api-access-z22qw") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "kube-api-access-z22qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.522499 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts" (OuterVolumeSpecName: "scripts") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.524455 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement52af-account-delete-gwjd7" event={"ID":"e0ac3371-e1ab-4d5d-a543-9b9b68a0118a","Type":"ContainerDied","Data":"d4f11f19829a63c1469b2e48510a08a0a86d0347531add1f3d36574c30e4c014"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.524600 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement52af-account-delete-gwjd7"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.529996 4882 generic.go:334] "Generic (PLEG): container finished" podID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerID="45bb1b9d3f14a231f70b2267ff101ad41616e2eb0158bfb01cf9db8c6d56d8ca" exitCode=0
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.530059 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerDied","Data":"45bb1b9d3f14a231f70b2267ff101ad41616e2eb0158bfb01cf9db8c6d56d8ca"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.546417 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerStarted","Data":"591e0b2c9d313cf5f0e413e1051aa9585b0f95b568ba3d808feaeffe7aa56741"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.550664 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:54630->10.217.0.202:8775: read: connection reset by peer"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.550758 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:54628->10.217.0.202:8775: read: connection reset by peer"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.565908 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-568599c566-7t7js" event={"ID":"6537afcb-4015-45f6-bdb5-68e0625c6ea6","Type":"ContainerDied","Data":"f1f9da0a3b102e508fb5fdb3b339242c8671cf3158ae59f00c1168bd693b7754"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.566080 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-568599c566-7t7js"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.573642 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp" (OuterVolumeSpecName: "kube-api-access-c5nbp") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "kube-api-access-c5nbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.574961 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc","Type":"ContainerDied","Data":"9f626802537d9b5bb34017058c5ae701d3df300f8cd131dcaf378d19d943f743"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.575065 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.587022 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6h9q\" (UniqueName: \"kubernetes.io/projected/b80a4ca1-90cc-4c29-a2de-13b4db198cef-kube-api-access-d6h9q\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.587060 4882 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.587073 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znmr2\" (UniqueName: \"kubernetes.io/projected/d6733b4d-ebf1-43cd-9960-3c25fca82e64-kube-api-access-znmr2\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589471 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sz8t\" (UniqueName: \"kubernetes.io/projected/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a-kube-api-access-4sz8t\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589499 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589509 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589519 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5nbp\" (UniqueName: \"kubernetes.io/projected/6537afcb-4015-45f6-bdb5-68e0625c6ea6-kube-api-access-c5nbp\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589528 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z22qw\" (UniqueName: \"kubernetes.io/projected/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-kube-api-access-z22qw\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.589537 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.606001 4882 generic.go:334] "Generic (PLEG): container finished" podID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerID="e93df0f0131f30a72dba182ec0f2f33b1dc2c6158f2a976f2348f6a3c71cbaaf" exitCode=0
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.606125 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerDied","Data":"e93df0f0131f30a72dba182ec0f2f33b1dc2c6158f2a976f2348f6a3c71cbaaf"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.609508 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.610514 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="proxy-httpd" containerID="cri-o://94742d7620ee6e9af50f0bdad8f628d680e7d044141c67f87aac9ef2bfa64a8b" gracePeriod=30
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.610774 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="sg-core" containerID="cri-o://513b374180132a0f297498712d1e51739fc2ad0c26615a542f982ac395df9d3e" gracePeriod=30
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.611603 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-notification-agent" containerID="cri-o://1f23372a0ffce94576fa7b8b12f78b5723d92783fb82f3c3a11c09be97afc71e" gracePeriod=30
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.618321 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-central-agent" containerID="cri-o://9a942b8c17e8463a8e6f76d1513de147a7fbccbc3c6bce75a888d236140df363" gracePeriod=30
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.627357 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.627636 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d7b05798-486d-493e-90eb-c09edb4bdc96" containerName="kube-state-metrics" containerID="cri-o://f281d92386df9b537dc64cd1c262d2a95b35ba4313d97c921f09342506f64857" gracePeriod=30
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.638879 4882 generic.go:334] "Generic (PLEG): container finished" podID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerID="f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f" exitCode=0
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.639076 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg"
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.639608 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerDied","Data":"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.639903 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-dccccfdb8-h8ffg" event={"ID":"d6733b4d-ebf1-43cd-9960-3c25fca82e64","Type":"ContainerDied","Data":"0e92b1baf65adc1945adace925b27e61d8fd9ec5f85e128c321df60451542625"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.703545 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54f5c99697-qjljg" event={"ID":"2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636","Type":"ContainerDied","Data":"2c29c361246fc5eb606a4aa1b6de2f64776e3b080d6fc8470369063ff5f186c2"}
Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.703662 4882 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-proxy-54f5c99697-qjljg" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.753263 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b156c0c6-4395-4609-8260-5ee8943d6813","Type":"ContainerDied","Data":"9b0c2a27240f8fc32be35d9bfb98821d1b35755d06218c96ee1b3acf010a2692"} Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.758700 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.782863 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68897cb7f8-5wgv6" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:52120->10.217.0.165:9311: read: connection reset by peer" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.783385 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68897cb7f8-5wgv6" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:52114->10.217.0.165:9311: read: connection reset by peer" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.872768 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" path="/var/lib/kubelet/pods/142d02f0-5616-42b6-b6fc-b37df2639f8a/volumes" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.898493 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92887968-fdd5-4653-a151-70e4a8f963fc" path="/var/lib/kubelet/pods/92887968-fdd5-4653-a151-70e4a8f963fc/volumes" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.899878 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbbf5b9-098a-4705-921b-8bff1b195af8" path="/var/lib/kubelet/pods/bfbbf5b9-098a-4705-921b-8bff1b195af8/volumes" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.900698 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf4780a-ca9d-46f8-8295-50e6154f70aa" path="/var/lib/kubelet/pods/caf4780a-ca9d-46f8-8295-50e6154f70aa/volumes" Oct 02 16:41:40 crc kubenswrapper[4882]: I1002 16:41:40.921358 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" (UID: "0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:40 crc kubenswrapper[4882]: E1002 16:41:40.951652 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:40 crc kubenswrapper[4882]: E1002 16:41:40.953634 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:40 crc kubenswrapper[4882]: E1002 16:41:40.966107 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 16:41:40 crc kubenswrapper[4882]: E1002 16:41:40.966181 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.017139 4882 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.280360 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6733b4d-ebf1-43cd-9960-3c25fca82e64" (UID: "d6733b4d-ebf1-43cd-9960-3c25fca82e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.329867 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b80a4ca1-90cc-4c29-a2de-13b4db198cef" (UID: "b80a4ca1-90cc-4c29-a2de-13b4db198cef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.335195 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.335252 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.385359 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data" (OuterVolumeSpecName: "config-data") pod "d6733b4d-ebf1-43cd-9960-3c25fca82e64" (UID: "d6733b4d-ebf1-43cd-9960-3c25fca82e64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.386059 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.437583 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data" (OuterVolumeSpecName: "config-data") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.438001 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6733b4d-ebf1-43cd-9960-3c25fca82e64-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.438092 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.438166 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.511747 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.526567 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data" (OuterVolumeSpecName: "config-data") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.541258 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.541298 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.554179 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" (UID: "2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.574403 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.579175 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.580329 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.580357 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.582128 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.585233 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: 
, exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.589408 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:41:41 crc kubenswrapper[4882]: E1002 16:41:41.589443 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.592590 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.628880 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data" (OuterVolumeSpecName: "config-data") pod "b80a4ca1-90cc-4c29-a2de-13b4db198cef" (UID: "b80a4ca1-90cc-4c29-a2de-13b4db198cef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.641533 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.642204 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") pod \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\" (UID: \"6537afcb-4015-45f6-bdb5-68e0625c6ea6\") " Oct 02 16:41:41 crc kubenswrapper[4882]: W1002 16:41:41.642697 4882 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6537afcb-4015-45f6-bdb5-68e0625c6ea6/volumes/kubernetes.io~secret/internal-tls-certs Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.642738 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.643328 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.643352 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b80a4ca1-90cc-4c29-a2de-13b4db198cef-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.643366 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.643380 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.708549 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6537afcb-4015-45f6-bdb5-68e0625c6ea6" (UID: "6537afcb-4015-45f6-bdb5-68e0625c6ea6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.745244 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6537afcb-4015-45f6-bdb5-68e0625c6ea6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.800271 4882 generic.go:334] "Generic (PLEG): container finished" podID="6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" containerID="17feefff1afafcdeb8858d5333569c764af86af1453576b2d47195609aa13105" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.805691 4882 generic.go:334] "Generic (PLEG): container finished" podID="bd2c7127-3671-4e79-aad9-01146803019e" containerID="7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.810404 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cindera969-account-delete-pgcm7" podUID="bb7cef74-9b55-4213-a934-7c1d2c058aab" containerName="mariadb-account-delete" containerID="cri-o://e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a" gracePeriod=30 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.819435 4882 generic.go:334] "Generic (PLEG): container finished" podID="8c726379-ca3b-428c-8091-c1870692c652" containerID="7a5c517fb7be5d54d65188945a9d902fc3b262ce0971504dd877fd1176e8b9f9" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.845156 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cindera969-account-delete-pgcm7" podStartSLOduration=7.845134932 podStartE2EDuration="7.845134932s" podCreationTimestamp="2025-10-02 16:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 16:41:41.843134272 +0000 UTC m=+1460.592363799" watchObservedRunningTime="2025-10-02 16:41:41.845134932 +0000 UTC 
m=+1460.594364459" Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.852272 4882 generic.go:334] "Generic (PLEG): container finished" podID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerID="42bd27be6298749bedf886c1fdcb856b9c7bdde2ff094fd0bbcab807ca38fc43" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.855137 4882 generic.go:334] "Generic (PLEG): container finished" podID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerID="94742d7620ee6e9af50f0bdad8f628d680e7d044141c67f87aac9ef2bfa64a8b" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.855421 4882 generic.go:334] "Generic (PLEG): container finished" podID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerID="513b374180132a0f297498712d1e51739fc2ad0c26615a542f982ac395df9d3e" exitCode=2 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.855428 4882 generic.go:334] "Generic (PLEG): container finished" podID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerID="9a942b8c17e8463a8e6f76d1513de147a7fbccbc3c6bce75a888d236140df363" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.869369 4882 generic.go:334] "Generic (PLEG): container finished" podID="d7b05798-486d-493e-90eb-c09edb4bdc96" containerID="f281d92386df9b537dc64cd1c262d2a95b35ba4313d97c921f09342506f64857" exitCode=2 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.871079 4882 generic.go:334] "Generic (PLEG): container finished" podID="ce1313f5-3aed-43a9-881d-bf61353ab6bd" containerID="1f6b1ba588353fa4b48eb5a0e6cd51674da94c25662edd41a4ae4d86d2142347" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.874829 4882 generic.go:334] "Generic (PLEG): container finished" podID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerID="591e0b2c9d313cf5f0e413e1051aa9585b0f95b568ba3d808feaeffe7aa56741" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.883413 4882 generic.go:334] "Generic (PLEG): container finished" podID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerID="8b9f83a977cbb31784f8c0ea343e36bdd7f6e161aef6bbb69620207d95bcb979" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.884824 4882 generic.go:334] "Generic (PLEG): container finished" podID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerID="2d2f7b7472a6b146044b414ca3cbab36ecb87a0c178a2b350803295c0549c80d" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.903571 4882 generic.go:334] "Generic (PLEG): container finished" podID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerID="c0cade657e8d743e4938ba6a036b7b7b533715caf8cc87468d9eea57d5a85617" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.905086 4882 generic.go:334] "Generic (PLEG): container finished" podID="261d06a1-c07d-4430-9984-24531fa935c6" containerID="0dbcbc8ee3a0c5fdc9984a4b48b1e902f6d70fc302ca3885063927617a9dc73e" exitCode=0 Oct 02 16:41:41 crc kubenswrapper[4882]: I1002 16:41:41.908332 4882 generic.go:334] "Generic (PLEG): container finished" podID="6a85d34b-abed-4d9a-aa75-9781d96a4c8b" containerID="eb5c19ef9a3d208810e06c49f83b88419f649c9cf980c36180a7a19b801964dc" exitCode=0 Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.162521 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.162618 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data podName:ad4f3fde-e95f-404d-baac-1c6238494afa nodeName:}" failed. 
No retries permitted until 2025-10-02 16:41:50.162593501 +0000 UTC m=+1468.911823028 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data") pod "rabbitmq-cell1-server-0" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa") : configmap "rabbitmq-cell1-config-data" not found Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.188757 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.190066 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.191636 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.191687 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" containerName="nova-cell0-conductor-conductor" Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.432406 4882 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.665s" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432446 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432472 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e8cf6351-a2e4-475f-a9f2-9006fee40049","Type":"ContainerDied","Data":"992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432728 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="992c81299a04e5daff196308e63e462f970c84fba7f54d8f4c13a48c41b6e8cc" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432797 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6e73-account-delete-49s2r" event={"ID":"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7","Type":"ContainerDied","Data":"17feefff1afafcdeb8858d5333569c764af86af1453576b2d47195609aa13105"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432816 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pnsg9"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432826 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8x4zn"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432836 4882 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd2c7127-3671-4e79-aad9-01146803019e","Type":"ContainerDied","Data":"7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432848 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bd2c7127-3671-4e79-aad9-01146803019e","Type":"ContainerDied","Data":"8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432857 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dcaaec2f7f80279d6258eff133f23d505f66b42f0440989531113ab2d0a73c5" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432869 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pnsg9"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432886 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8x4zn"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432897 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432909 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0de09a9-9a37-4c03-abd4-002230d4f583","Type":"ContainerDied","Data":"60672cd4904b96bbaea11c03ffc6269842a669ec325af40afd317152dd8f413a"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432920 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60672cd4904b96bbaea11c03ffc6269842a669ec325af40afd317152dd8f413a" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432928 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432940 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-b49v2"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432949 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b416-account-create-2nr92"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432957 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindera969-account-delete-pgcm7" event={"ID":"bb7cef74-9b55-4213-a934-7c1d2c058aab","Type":"ContainerStarted","Data":"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432967 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapie3bb-account-delete-wjz8t" event={"ID":"8c726379-ca3b-428c-8091-c1870692c652","Type":"ContainerDied","Data":"7a5c517fb7be5d54d65188945a9d902fc3b262ce0971504dd877fd1176e8b9f9"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432978 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerDied","Data":"42bd27be6298749bedf886c1fdcb856b9c7bdde2ff094fd0bbcab807ca38fc43"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.432992 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-b49v2"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433002 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b416-account-create-2nr92"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 
16:41:42.433392 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68897cb7f8-5wgv6" event={"ID":"bb8a47fb-b921-4b63-9529-a49d1ec506fb","Type":"ContainerDied","Data":"c1531f1c4ec303ed31f6714cd270c9a3cd43e44dcabbf282527f44b222d6cbb3"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433427 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1531f1c4ec303ed31f6714cd270c9a3cd43e44dcabbf282527f44b222d6cbb3" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433646 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerDied","Data":"94742d7620ee6e9af50f0bdad8f628d680e7d044141c67f87aac9ef2bfa64a8b"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433539 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7df8f99cc4-7tfd9" podUID="d5be6998-0b42-475a-8418-032327087ace" containerName="keystone-api" containerID="cri-o://02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398" gracePeriod=30 Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433712 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerDied","Data":"513b374180132a0f297498712d1e51739fc2ad0c26615a542f982ac395df9d3e"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433745 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerDied","Data":"9a942b8c17e8463a8e6f76d1513de147a7fbccbc3c6bce75a888d236140df363"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433761 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7b05798-486d-493e-90eb-c09edb4bdc96","Type":"ContainerDied","Data":"f281d92386df9b537dc64cd1c262d2a95b35ba4313d97c921f09342506f64857"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433772 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7b05798-486d-493e-90eb-c09edb4bdc96","Type":"ContainerDied","Data":"b1d9df38946f9d4ea04e197424306a3777d07e8fb20beef0551267a3070be30e"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433797 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d9df38946f9d4ea04e197424306a3777d07e8fb20beef0551267a3070be30e" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433814 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancec9ca-account-delete-csrvn" event={"ID":"ce1313f5-3aed-43a9-881d-bf61353ab6bd","Type":"ContainerDied","Data":"1f6b1ba588353fa4b48eb5a0e6cd51674da94c25662edd41a4ae4d86d2142347"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433827 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerDied","Data":"591e0b2c9d313cf5f0e413e1051aa9585b0f95b568ba3d808feaeffe7aa56741"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433840 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerDied","Data":"8b9f83a977cbb31784f8c0ea343e36bdd7f6e161aef6bbb69620207d95bcb979"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433853 
4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerDied","Data":"2d2f7b7472a6b146044b414ca3cbab36ecb87a0c178a2b350803295c0549c80d"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433866 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5b37b99-d806-4dce-a73e-653f8ebc5567","Type":"ContainerDied","Data":"43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433877 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d253676362a72567b43a395ecee3faabb34fa32441161c49efe82223b1abb9" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433887 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a33ca09-ff99-44fd-a978-ef69315caf26","Type":"ContainerDied","Data":"be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433898 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be82b60aebcd42d773671019413d9eac36d7c87b003ef838b1a35f586f897a59" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433906 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerDied","Data":"c0cade657e8d743e4938ba6a036b7b7b533715caf8cc87468d9eea57d5a85617"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433917 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell050c5-account-delete-7t9wx" event={"ID":"261d06a1-c07d-4430-9984-24531fa935c6","Type":"ContainerDied","Data":"0dbcbc8ee3a0c5fdc9984a4b48b1e902f6d70fc302ca3885063927617a9dc73e"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.433928 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0481-account-delete-lt5bg" event={"ID":"6a85d34b-abed-4d9a-aa75-9781d96a4c8b","Type":"ContainerDied","Data":"eb5c19ef9a3d208810e06c49f83b88419f649c9cf980c36180a7a19b801964dc"} Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.434126 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" containerName="memcached" containerID="cri-o://2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2" gracePeriod=30 Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.463581 4882 scope.go:117] "RemoveContainer" containerID="1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed" Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.494333 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 is running failed: container process not found" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.496120 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 is running failed: container process not found" 
containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.497737 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 is running failed: container process not found" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.497800 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" containerName="nova-scheduler-scheduler" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.500173 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.505235 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.521029 4882 scope.go:117] "RemoveContainer" containerID="b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224" Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.526156 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224\": container with ID starting with b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224 not found: ID does not exist" containerID="b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.526268 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224"} err="failed to get container status \"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224\": rpc error: code = NotFound desc = could not find container \"b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224\": container with ID starting with b1a611be2ed648257dae65544f407b9891a07e2cbe03f6796f23707cd42e0224 not found: ID does not exist" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.526308 4882 scope.go:117] "RemoveContainer" containerID="1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed" Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.532407 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed\": container with ID starting with 1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed not found: ID does not exist" containerID="1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.532820 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed"} err="failed to get container status 
\"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed\": rpc error: code = NotFound desc = could not find container \"1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed\": container with ID starting with 1cc2b35c70088032eca6e6d9959a0a9348b7e01932ded661a6a3101371cca2ed not found: ID does not exist" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.532956 4882 scope.go:117] "RemoveContainer" containerID="2155a8ead6611495ac2656cb91200c865eee705b0367cd61131e9b9efc40fcda" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.537827 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569417 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569511 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569539 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569575 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569612 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569672 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569703 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569734 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569763 4882 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569800 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569830 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569853 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25bt7\" (UniqueName: \"kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569898 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569928 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kj4z\" (UniqueName: \"kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569956 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569987 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs\") pod \"8a33ca09-ff99-44fd-a978-ef69315caf26\" (UID: \"8a33ca09-ff99-44fd-a978-ef69315caf26\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.570053 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data\") pod \"e8cf6351-a2e4-475f-a9f2-9006fee40049\" (UID: \"e8cf6351-a2e4-475f-a9f2-9006fee40049\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.569553 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.578722 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.583038 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7" (OuterVolumeSpecName: "kube-api-access-25bt7") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "kube-api-access-25bt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.584228 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs" (OuterVolumeSpecName: "logs") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.584523 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.586147 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts" (OuterVolumeSpecName: "scripts") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.586874 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs" (OuterVolumeSpecName: "logs") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.589383 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z" (OuterVolumeSpecName: "kube-api-access-8kj4z") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "kube-api-access-8kj4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.593823 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.601567 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts" (OuterVolumeSpecName: "scripts") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.613509 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.616415 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.665916 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.672487 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data" (OuterVolumeSpecName: "config-data") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.674491 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675057 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gnrb\" (UniqueName: \"kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb\") pod \"bd2c7127-3671-4e79-aad9-01146803019e\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675168 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data\") pod \"bd2c7127-3671-4e79-aad9-01146803019e\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675328 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle\") pod \"bd2c7127-3671-4e79-aad9-01146803019e\" (UID: \"bd2c7127-3671-4e79-aad9-01146803019e\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675701 4882 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a33ca09-ff99-44fd-a978-ef69315caf26-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675713 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675721 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675729 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675738 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675746 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675754 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675762 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a33ca09-ff99-44fd-a978-ef69315caf26-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675772 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 
16:41:42.675781 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675788 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8cf6351-a2e4-475f-a9f2-9006fee40049-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675796 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25bt7\" (UniqueName: \"kubernetes.io/projected/e8cf6351-a2e4-475f-a9f2-9006fee40049-kube-api-access-25bt7\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675815 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675824 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kj4z\" (UniqueName: \"kubernetes.io/projected/8a33ca09-ff99-44fd-a978-ef69315caf26-kube-api-access-8kj4z\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.675832 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.689874 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb" (OuterVolumeSpecName: "kube-api-access-2gnrb") pod "bd2c7127-3671-4e79-aad9-01146803019e" (UID: "bd2c7127-3671-4e79-aad9-01146803019e"). InnerVolumeSpecName "kube-api-access-2gnrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.696872 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data" (OuterVolumeSpecName: "config-data") pod "bd2c7127-3671-4e79-aad9-01146803019e" (UID: "bd2c7127-3671-4e79-aad9-01146803019e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.701269 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data" (OuterVolumeSpecName: "config-data") pod "e8cf6351-a2e4-475f-a9f2-9006fee40049" (UID: "e8cf6351-a2e4-475f-a9f2-9006fee40049"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.704604 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.773526 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00110a10-c070-4648-b80d-33beb4a63b86" path="/var/lib/kubelet/pods/00110a10-c070-4648-b80d-33beb4a63b86/volumes" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.775319 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b9fc00-b554-4426-ba0e-7bcbef667297" path="/var/lib/kubelet/pods/09b9fc00-b554-4426-ba0e-7bcbef667297/volumes" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.779564 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d444aa-bebd-4e21-bff0-c6ad1be5fc18" path="/var/lib/kubelet/pods/10d444aa-bebd-4e21-bff0-c6ad1be5fc18/volumes" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.779567 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.779828 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gnrb\" (UniqueName: \"kubernetes.io/projected/bd2c7127-3671-4e79-aad9-01146803019e-kube-api-access-2gnrb\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.779854 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8cf6351-a2e4-475f-a9f2-9006fee40049-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.779870 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.779605 4882 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 16:41:42 crc kubenswrapper[4882]: E1002 16:41:42.780071 4882 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data podName:74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42 nodeName:}" failed. No retries permitted until 2025-10-02 16:41:50.780048912 +0000 UTC m=+1469.529278439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data") pod "rabbitmq-server-0" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42") : configmap "rabbitmq-config-data" not found Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.781063 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9308bac-63a2-4c6a-88b5-23d8083feac8" path="/var/lib/kubelet/pods/a9308bac-63a2-4c6a-88b5-23d8083feac8/volumes" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.783005 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd2c7127-3671-4e79-aad9-01146803019e" (UID: "bd2c7127-3671-4e79-aad9-01146803019e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.783889 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="galera" containerID="cri-o://37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36" gracePeriod=30 Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.787075 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8a33ca09-ff99-44fd-a978-ef69315caf26" (UID: "8a33ca09-ff99-44fd-a978-ef69315caf26"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.854867 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.880901 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.882032 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.882607 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c7127-3671-4e79-aad9-01146803019e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.882646 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a33ca09-ff99-44fd-a978-ef69315caf26-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.891436 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-568599c566-7t7js"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.893001 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.905421 4882 scope.go:117] "RemoveContainer" containerID="c61dc991f9c47938a3b335bd886b13edcae91852caa89730fe80338f674f2a36" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.908091 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.925346 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.936811 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.973416 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-54f5c99697-qjljg"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.985073 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986199 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config\") pod \"d7b05798-486d-493e-90eb-c09edb4bdc96\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986344 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986450 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986530 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986691 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986788 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbl5n\" (UniqueName: \"kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n\") pod \"d7b05798-486d-493e-90eb-c09edb4bdc96\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.986893 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.987070 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle\") pod \"d7b05798-486d-493e-90eb-c09edb4bdc96\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.987196 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.987300 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs\") pod \"d7b05798-486d-493e-90eb-c09edb4bdc96\" (UID: \"d7b05798-486d-493e-90eb-c09edb4bdc96\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.987407 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv9h9\" (UniqueName: \"kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9\") pod \"e0de09a9-9a37-4c03-abd4-002230d4f583\" (UID: \"e0de09a9-9a37-4c03-abd4-002230d4f583\") " Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.989156 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.990404 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts" (OuterVolumeSpecName: "scripts") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.990662 4882 scope.go:117] "RemoveContainer" containerID="9e298188f83c2dcdfd495c92ac971fa8c62421680bfe7edca5990f87e39223ce" Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.991484 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:41:42 crc kubenswrapper[4882]: I1002 16:41:42.992239 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs" (OuterVolumeSpecName: "logs") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.006516 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-dccccfdb8-h8ffg"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.009436 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.010616 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.010856 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.010135 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n" (OuterVolumeSpecName: "kube-api-access-gbl5n") pod "d7b05798-486d-493e-90eb-c09edb4bdc96" (UID: "d7b05798-486d-493e-90eb-c09edb4bdc96"). InnerVolumeSpecName "kube-api-access-gbl5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.016188 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9" (OuterVolumeSpecName: "kube-api-access-gv9h9") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "kube-api-access-gv9h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.016584 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.019516 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f","Type":"ContainerDied","Data":"f550241b15c95fcc625121707c03baa4fef5139fcf8590c21af01985fe802e52"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.030762 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.042543 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.045124 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.048971 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.055824 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.056044 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"af2f3ba0-9530-4875-84e1-df99cc4761a6","Type":"ContainerDied","Data":"a34587d375e63d5807bf75834c5061dd3c3db2e8d13809c6042eb23945306774"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.057593 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-68fcbbc6fc-4n89r"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.065233 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.074394 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d7b05798-486d-493e-90eb-c09edb4bdc96" (UID: "d7b05798-486d-493e-90eb-c09edb4bdc96"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.074668 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement52af-account-delete-gwjd7"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.074974 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7b05798-486d-493e-90eb-c09edb4bdc96" (UID: "d7b05798-486d-493e-90eb-c09edb4bdc96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.075858 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell050c5-account-delete-7t9wx" event={"ID":"261d06a1-c07d-4430-9984-24531fa935c6","Type":"ContainerDied","Data":"c54c38cc5cd5edb23cbba0efa73190a8814f344f25cf5ef0ee52cc158cf4f02b"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.075903 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c54c38cc5cd5edb23cbba0efa73190a8814f344f25cf5ef0ee52cc158cf4f02b" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.076270 4882 scope.go:117] "RemoveContainer" containerID="59d76121f684bf647e82fb7939a01b0eef06443bf9081dd3691a4619a445253e" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.076731 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican6e73-account-delete-49s2r" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.082654 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerStarted","Data":"affa33b2bf077d80aecdfb407f2de76819f10f06b17bbaa1e1f4ae1023900018"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.089802 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancec9ca-account-delete-csrvn" event={"ID":"ce1313f5-3aed-43a9-881d-bf61353ab6bd","Type":"ContainerDied","Data":"f53758632758ed8cde1e7558fff569434b2d069520d4ee62de24b7655f12d844"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090047 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53758632758ed8cde1e7558fff569434b2d069520d4ee62de24b7655f12d844" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090716 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090769 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090800 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090840 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090872 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090915 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs\") pod \"b5b37b99-d806-4dce-a73e-653f8ebc5567\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090946 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090965 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tdnfp\" (UniqueName: \"kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.090985 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091046 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091065 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data\") pod \"b5b37b99-d806-4dce-a73e-653f8ebc5567\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091082 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs\") pod \"b5b37b99-d806-4dce-a73e-653f8ebc5567\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091106 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data\") pod \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\" (UID: \"909dd4dd-0b5d-4b6b-b64a-7c59cd26256f\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091126 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091184 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle\") pod \"b5b37b99-d806-4dce-a73e-653f8ebc5567\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091221 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091253 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlp57\" (UniqueName: \"kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57\") pod \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\" (UID: \"bb8a47fb-b921-4b63-9529-a49d1ec506fb\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.091751 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-z7hlh\" (UniqueName: \"kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh\") pod \"b5b37b99-d806-4dce-a73e-653f8ebc5567\" (UID: \"b5b37b99-d806-4dce-a73e-653f8ebc5567\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092105 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092117 4882 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092128 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092150 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092160 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbl5n\" (UniqueName: \"kubernetes.io/projected/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-api-access-gbl5n\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092171 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092179 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092190 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0de09a9-9a37-4c03-abd4-002230d4f583-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092198 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv9h9\" (UniqueName: \"kubernetes.io/projected/e0de09a9-9a37-4c03-abd4-002230d4f583-kube-api-access-gv9h9\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092604 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancec9ca-account-delete-csrvn" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.092974 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.095646 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs" (OuterVolumeSpecName: "logs") pod "b5b37b99-d806-4dce-a73e-653f8ebc5567" (UID: "b5b37b99-d806-4dce-a73e-653f8ebc5567"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.101077 4882 generic.go:334] "Generic (PLEG): container finished" podID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerID="578c94adafe095435dbcb87f647011c96a388fb1edfe93dbe48b4ca0639d68b2" exitCode=0 Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.101548 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerDied","Data":"578c94adafe095435dbcb87f647011c96a388fb1edfe93dbe48b4ca0639d68b2"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.102717 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.109105 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs" (OuterVolumeSpecName: "logs") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.109837 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapie3bb-account-delete-wjz8t" event={"ID":"8c726379-ca3b-428c-8091-c1870692c652","Type":"ContainerDied","Data":"a75932169d93f7f6c0536bdd0582c27f913497562a2e6b65fd183bc8593d7af0"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.109865 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs" (OuterVolumeSpecName: "logs") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.109988 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapie3bb-account-delete-wjz8t" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.111571 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.112068 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.121334 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh" (OuterVolumeSpecName: "kube-api-access-z7hlh") pod "b5b37b99-d806-4dce-a73e-653f8ebc5567" (UID: "b5b37b99-d806-4dce-a73e-653f8ebc5567"). 
InnerVolumeSpecName "kube-api-access-z7hlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.121564 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57" (OuterVolumeSpecName: "kube-api-access-rlp57") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "kube-api-access-rlp57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.136150 4882 scope.go:117] "RemoveContainer" containerID="f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.137431 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data" (OuterVolumeSpecName: "config-data") pod "e0de09a9-9a37-4c03-abd4-002230d4f583" (UID: "e0de09a9-9a37-4c03-abd4-002230d4f583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.139284 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.144257 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp" (OuterVolumeSpecName: "kube-api-access-tdnfp") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "kube-api-access-tdnfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.144519 4882 generic.go:334] "Generic (PLEG): container finished" podID="06e85b8e-fa65-4016-adbb-e72100f18388" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" exitCode=0 Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.144602 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06e85b8e-fa65-4016-adbb-e72100f18388","Type":"ContainerDied","Data":"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.144638 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06e85b8e-fa65-4016-adbb-e72100f18388","Type":"ContainerDied","Data":"14358c4b1a4ae01495b201e9e95620eb2076d624c76e8b67d55e6d525c3f8b16"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.144681 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.156585 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.158675 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron0481-account-delete-lt5bg" event={"ID":"6a85d34b-abed-4d9a-aa75-9781d96a4c8b","Type":"ContainerDied","Data":"c7ea1be20bb1a5e7036c38bd966626c7e2305fbd438fa7acd35f4d593c017765"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.158933 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron0481-account-delete-lt5bg" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.160352 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data" (OuterVolumeSpecName: "config-data") pod "b5b37b99-d806-4dce-a73e-653f8ebc5567" (UID: "b5b37b99-d806-4dce-a73e-653f8ebc5567"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174300 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174377 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6e73-account-delete-49s2r" event={"ID":"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7","Type":"ContainerDied","Data":"16233244175ba86c65f647d97a08985f6039ba582598ad75192f27c67f742768"} Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174490 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174552 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174587 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174778 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6e73-account-delete-49s2r" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.174937 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.175952 4882 scope.go:117] "RemoveContainer" containerID="03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.191299 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203276 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203345 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfg96\" (UniqueName: \"kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96\") pod \"8c726379-ca3b-428c-8091-c1870692c652\" (UID: \"8c726379-ca3b-428c-8091-c1870692c652\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203399 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvlz\" (UniqueName: \"kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz\") pod \"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7\" (UID: \"6b9a74a6-19b0-4ab2-a047-9ff9c13137d7\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203434 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203461 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203480 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpqxm\" (UniqueName: \"kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm\") pod \"06e85b8e-fa65-4016-adbb-e72100f18388\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203527 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7mv\" (UniqueName: \"kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv\") pod \"6a85d34b-abed-4d9a-aa75-9781d96a4c8b\" (UID: \"6a85d34b-abed-4d9a-aa75-9781d96a4c8b\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203562 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle\") pod \"06e85b8e-fa65-4016-adbb-e72100f18388\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203632 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data\") pod \"06e85b8e-fa65-4016-adbb-e72100f18388\" (UID: \"06e85b8e-fa65-4016-adbb-e72100f18388\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203705 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnzs2\" (UniqueName: 
\"kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2\") pod \"ce1313f5-3aed-43a9-881d-bf61353ab6bd\" (UID: \"ce1313f5-3aed-43a9-881d-bf61353ab6bd\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203843 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203897 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpncj\" (UniqueName: \"kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.203987 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id\") pod \"af2f3ba0-9530-4875-84e1-df99cc4761a6\" (UID: \"af2f3ba0-9530-4875-84e1-df99cc4761a6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204796 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlp57\" (UniqueName: \"kubernetes.io/projected/bb8a47fb-b921-4b63-9529-a49d1ec506fb-kube-api-access-rlp57\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204817 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hlh\" (UniqueName: \"kubernetes.io/projected/b5b37b99-d806-4dce-a73e-653f8ebc5567-kube-api-access-z7hlh\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204828 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204837 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204850 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204862 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204871 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdnfp\" (UniqueName: \"kubernetes.io/projected/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-kube-api-access-tdnfp\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204881 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0de09a9-9a37-4c03-abd4-002230d4f583-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204890 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204901 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204910 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b37b99-d806-4dce-a73e-653f8ebc5567-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.204918 4882 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb8a47fb-b921-4b63-9529-a49d1ec506fb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.218751 4882 scope.go:117] "RemoveContainer" containerID="f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f" Oct 02 16:41:43 crc kubenswrapper[4882]: E1002 16:41:43.222590 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f\": container with ID starting with f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f not found: ID does not exist" containerID="f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.222638 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f"} err="failed to get container status \"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f\": rpc error: code = NotFound desc = could not find container \"f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f\": container with ID starting with f2a957160c182aca29ff3de2f149668f84001bb36683d28a8c196e2b1d69492f not found: ID does not exist" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.222670 4882 scope.go:117] "RemoveContainer" containerID="03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.223416 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: E1002 16:41:43.225579 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d\": container with ID starting with 03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d not found: ID does not exist" containerID="03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.225607 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d"} err="failed to get container status \"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d\": rpc error: code = NotFound desc = could not find container \"03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d\": container with ID starting with 03048675add5ee79a2b2ef2adb2dadc9f9f974570c944c98a03a93a087d3438d not found: ID does not exist" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.225628 4882 scope.go:117] "RemoveContainer" containerID="df07f33c6b476688e48e7aac7112b9a252910e43159a82f95d8b350ed2eaa481" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.231402 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96" (OuterVolumeSpecName: "kube-api-access-wfg96") pod "8c726379-ca3b-428c-8091-c1870692c652" (UID: "8c726379-ca3b-428c-8091-c1870692c652"). InnerVolumeSpecName "kube-api-access-wfg96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.235467 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.239057 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz" (OuterVolumeSpecName: "kube-api-access-4gvlz") pod "6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" (UID: "6b9a74a6-19b0-4ab2-a047-9ff9c13137d7"). InnerVolumeSpecName "kube-api-access-4gvlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.239158 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj" (OuterVolumeSpecName: "kube-api-access-kpncj") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "kube-api-access-kpncj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.241996 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2" (OuterVolumeSpecName: "kube-api-access-hnzs2") pod "ce1313f5-3aed-43a9-881d-bf61353ab6bd" (UID: "ce1313f5-3aed-43a9-881d-bf61353ab6bd"). InnerVolumeSpecName "kube-api-access-hnzs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.242200 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm" (OuterVolumeSpecName: "kube-api-access-cpqxm") pod "06e85b8e-fa65-4016-adbb-e72100f18388" (UID: "06e85b8e-fa65-4016-adbb-e72100f18388"). InnerVolumeSpecName "kube-api-access-cpqxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.247719 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.249041 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts" (OuterVolumeSpecName: "scripts") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.261474 4882 scope.go:117] "RemoveContainer" containerID="130c4a62c2b640eaef7a5ead25af6e7d9449b34010c056f857dac87df15f4c2a" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.265113 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv" (OuterVolumeSpecName: "kube-api-access-7b7mv") pod "6a85d34b-abed-4d9a-aa75-9781d96a4c8b" (UID: "6a85d34b-abed-4d9a-aa75-9781d96a4c8b"). InnerVolumeSpecName "kube-api-access-7b7mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.276290 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hzll4" podStartSLOduration=6.142054397 podStartE2EDuration="9.276263383s" podCreationTimestamp="2025-10-02 16:41:34 +0000 UTC" firstStartedPulling="2025-10-02 16:41:38.206349306 +0000 UTC m=+1456.955578833" lastFinishedPulling="2025-10-02 16:41:41.340558292 +0000 UTC m=+1460.089787819" observedRunningTime="2025-10-02 16:41:43.24482791 +0000 UTC m=+1461.994057447" watchObservedRunningTime="2025-10-02 16:41:43.276263383 +0000 UTC m=+1462.025492930" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.277194 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data" (OuterVolumeSpecName: "config-data") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.307143 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rdht\" (UniqueName: \"kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht\") pod \"261d06a1-c07d-4430-9984-24531fa935c6\" (UID: \"261d06a1-c07d-4430-9984-24531fa935c6\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308499 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvlz\" (UniqueName: \"kubernetes.io/projected/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7-kube-api-access-4gvlz\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308515 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308524 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpqxm\" (UniqueName: \"kubernetes.io/projected/06e85b8e-fa65-4016-adbb-e72100f18388-kube-api-access-cpqxm\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308533 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7mv\" (UniqueName: \"kubernetes.io/projected/6a85d34b-abed-4d9a-aa75-9781d96a4c8b-kube-api-access-7b7mv\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308541 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308552 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnzs2\" (UniqueName: \"kubernetes.io/projected/ce1313f5-3aed-43a9-881d-bf61353ab6bd-kube-api-access-hnzs2\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308560 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308569 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpncj\" (UniqueName: \"kubernetes.io/projected/af2f3ba0-9530-4875-84e1-df99cc4761a6-kube-api-access-kpncj\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308577 4882 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af2f3ba0-9530-4875-84e1-df99cc4761a6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308592 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.308601 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfg96\" (UniqueName: \"kubernetes.io/projected/8c726379-ca3b-428c-8091-c1870692c652-kube-api-access-wfg96\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.341918 4882 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht" (OuterVolumeSpecName: "kube-api-access-9rdht") pod "261d06a1-c07d-4430-9984-24531fa935c6" (UID: "261d06a1-c07d-4430-9984-24531fa935c6"). InnerVolumeSpecName "kube-api-access-9rdht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.342046 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.342425 4882 scope.go:117] "RemoveContainer" containerID="ce86d9d2458ffecb5f7269b0144634e7551d8db7396b16d935a1e65eea3b1726" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.355126 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.385342 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.385338 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5b37b99-d806-4dce-a73e-653f8ebc5567" (UID: "b5b37b99-d806-4dce-a73e-653f8ebc5567"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.386011 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5b37b99-d806-4dce-a73e-653f8ebc5567" (UID: "b5b37b99-d806-4dce-a73e-653f8ebc5567"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.387266 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.388175 4882 scope.go:117] "RemoveContainer" containerID="29253f1351cb8c376d642b9c868a7dc63855121b01b0705c9179056a230e6ebf" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.401936 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data" (OuterVolumeSpecName: "config-data") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.404312 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06e85b8e-fa65-4016-adbb-e72100f18388" (UID: "06e85b8e-fa65-4016-adbb-e72100f18388"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413281 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413317 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413334 4882 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413348 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rdht\" (UniqueName: \"kubernetes.io/projected/261d06a1-c07d-4430-9984-24531fa935c6-kube-api-access-9rdht\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413359 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413370 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b37b99-d806-4dce-a73e-653f8ebc5567-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.413391 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.441101 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data" (OuterVolumeSpecName: "config-data") pod "06e85b8e-fa65-4016-adbb-e72100f18388" (UID: "06e85b8e-fa65-4016-adbb-e72100f18388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.449564 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d7b05798-486d-493e-90eb-c09edb4bdc96" (UID: "d7b05798-486d-493e-90eb-c09edb4bdc96"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.456519 4882 scope.go:117] "RemoveContainer" containerID="c0cade657e8d743e4938ba6a036b7b7b533715caf8cc87468d9eea57d5a85617" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.456932 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" (UID: "909dd4dd-0b5d-4b6b-b64a-7c59cd26256f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.479307 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.507461 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.507522 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.507797 4882 scope.go:117] "RemoveContainer" containerID="63647f011c653d8f4af95c4fa30a2a264c7bf4c99639ef5a76ed2e9bca13fa93" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.511783 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.515662 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.515696 4882 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b05798-486d-493e-90eb-c09edb4bdc96-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.515709 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e85b8e-fa65-4016-adbb-e72100f18388-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.515721 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.522987 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bb8a47fb-b921-4b63-9529-a49d1ec506fb" (UID: "bb8a47fb-b921-4b63-9529-a49d1ec506fb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.525931 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.554739 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.554787 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.565320 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.576652 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapie3bb-account-delete-wjz8t"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.582040 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.589359 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.593495 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.625472 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb8a47fb-b921-4b63-9529-a49d1ec506fb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.658290 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data" (OuterVolumeSpecName: "config-data") pod "af2f3ba0-9530-4875-84e1-df99cc4761a6" (UID: "af2f3ba0-9530-4875-84e1-df99cc4761a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732172 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556fs\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732234 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732256 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732318 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732350 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732370 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732399 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732443 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732469 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732493 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: 
\"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732508 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd\") pod \"ad4f3fde-e95f-404d-baac-1c6238494afa\" (UID: \"ad4f3fde-e95f-404d-baac-1c6238494afa\") " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.732797 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af2f3ba0-9530-4875-84e1-df99cc4761a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.733710 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.734626 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.736061 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.753871 4882 scope.go:117] "RemoveContainer" containerID="08bb0f006c953ee6edc2f62804afed6ad5278a9319ec73877534ab1ea24041e5" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.769820 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs" (OuterVolumeSpecName: "kube-api-access-556fs") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "kube-api-access-556fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.769903 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.773444 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info" (OuterVolumeSpecName: "pod-info") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.792598 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.795661 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.833946 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data" (OuterVolumeSpecName: "config-data") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835036 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556fs\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-kube-api-access-556fs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835637 4882 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835653 4882 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad4f3fde-e95f-404d-baac-1c6238494afa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835664 4882 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad4f3fde-e95f-404d-baac-1c6238494afa-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835675 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835687 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835710 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.835723 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc 
kubenswrapper[4882]: I1002 16:41:43.835736 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.852246 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf" (OuterVolumeSpecName: "server-conf") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.878502 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.888527 4882 scope.go:117] "RemoveContainer" containerID="8b9f83a977cbb31784f8c0ea343e36bdd7f6e161aef6bbb69620207d95bcb979" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.888823 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.895059 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.904828 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.916198 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.920974 4882 scope.go:117] "RemoveContainer" containerID="7a5c517fb7be5d54d65188945a9d902fc3b262ce0971504dd877fd1176e8b9f9" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.921945 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron0481-account-delete-lt5bg"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.946180 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.946329 4882 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad4f3fde-e95f-404d-baac-1c6238494afa-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.953860 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.962011 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican6e73-account-delete-49s2r"] Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.972381 4882 scope.go:117] "RemoveContainer" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" Oct 02 16:41:43 crc kubenswrapper[4882]: I1002 16:41:43.984440 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ad4f3fde-e95f-404d-baac-1c6238494afa" (UID: "ad4f3fde-e95f-404d-baac-1c6238494afa"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.047852 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs\") pod \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.047965 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data\") pod \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.048023 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle\") pod \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.048110 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg57d\" (UniqueName: \"kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d\") pod \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.048171 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config\") pod \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\" (UID: \"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.048565 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad4f3fde-e95f-404d-baac-1c6238494afa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.049021 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" (UID: "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.049197 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data" (OuterVolumeSpecName: "config-data") pod "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" (UID: "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.051646 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d" (OuterVolumeSpecName: "kube-api-access-bg57d") pod "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" (UID: "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b"). InnerVolumeSpecName "kube-api-access-bg57d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.088545 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" (UID: "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.093735 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" (UID: "51ada43e-a36e-49c7-bc9e-6c3151d2eb6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.129772 4882 scope.go:117] "RemoveContainer" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" Oct 02 16:41:44 crc kubenswrapper[4882]: E1002 16:41:44.130310 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5\": container with ID starting with 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 not found: ID does not exist" containerID="01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.130348 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5"} err="failed to get container status \"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5\": rpc error: code = NotFound desc = could not find container \"01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5\": container with ID starting with 01e70baa4ab97143a1e7bf03a073e2d83d8ff40f522cc03a7418b0c435228ba5 not found: ID does not exist" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.130376 4882 scope.go:117] "RemoveContainer" containerID="eb5c19ef9a3d208810e06c49f83b88419f649c9cf980c36180a7a19b801964dc" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.150273 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.150311 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg57d\" (UniqueName: \"kubernetes.io/projected/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kube-api-access-bg57d\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.150325 4882 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.150337 4882 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.150349 4882 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.172694 4882 scope.go:117] "RemoveContainer" containerID="17feefff1afafcdeb8858d5333569c764af86af1453576b2d47195609aa13105" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.208669 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_05fb59c5-aa61-4ec3-866f-3a4551737f80/ovn-northd/0.log" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.208729 4882 generic.go:334] "Generic (PLEG): container finished" podID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" exitCode=139 Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.208827 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerDied","Data":"cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4"} Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.233715 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ad4f3fde-e95f-404d-baac-1c6238494afa","Type":"ContainerDied","Data":"9511454e831a71a2a3e763a37979a48a0a41b701f447b13d0e58e793d20ded8c"} Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.233809 4882 scope.go:117] "RemoveContainer" containerID="578c94adafe095435dbcb87f647011c96a388fb1edfe93dbe48b4ca0639d68b2" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.233741 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.239475 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.245920 4882 generic.go:334] "Generic (PLEG): container finished" podID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerID="5681a9da320379f932adf71053bca8a31e077d6a1f4a09e0b83765c5625e4ade" exitCode=0 Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.246007 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerDied","Data":"5681a9da320379f932adf71053bca8a31e077d6a1f4a09e0b83765c5625e4ade"} Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.250158 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.261899 4882 scope.go:117] "RemoveContainer" containerID="0d67f1f0413558904549638569857532409de73c996c52a84ea4614ae2ae72a3" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.275010 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.277574 4882 generic.go:334] "Generic (PLEG): container finished" podID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" containerID="2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2" exitCode=0 Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.278470 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.282540 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell050c5-account-delete-7t9wx" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.284087 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b","Type":"ContainerDied","Data":"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2"} Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.284266 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51ada43e-a36e-49c7-bc9e-6c3151d2eb6b","Type":"ContainerDied","Data":"ffd18c2b01e6b06ae1a6a1265915fa190763d51c676bef284a24d42044e3b515"} Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.284261 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68897cb7f8-5wgv6" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.284322 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancec9ca-account-delete-csrvn" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.284364 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.321108 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.338864 4882 scope.go:117] "RemoveContainer" containerID="2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.340112 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.370286 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.389182 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_05fb59c5-aa61-4ec3-866f-3a4551737f80/ovn-northd/0.log" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.389300 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.403558 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.403620 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.432054 4882 scope.go:117] "RemoveContainer" containerID="2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2" Oct 02 16:41:44 crc kubenswrapper[4882]: E1002 16:41:44.433754 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2\": container with ID starting with 2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2 not found: ID does not exist" containerID="2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.433812 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2"} err="failed to get container status \"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2\": rpc error: code = NotFound desc = could not find container \"2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2\": container with ID starting with 2830fc3c252afc6b407a10ca884e30a2fd8d21d06e9c089ebeb9c5249931c2a2 not found: ID does not exist" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.447490 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464258 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464335 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs5nk\" (UniqueName: \"kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464464 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464518 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464580 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: 
\"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464603 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-northd-tls-certs\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464644 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle\") pod \"05fb59c5-aa61-4ec3-866f-3a4551737f80\" (UID: \"05fb59c5-aa61-4ec3-866f-3a4551737f80\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.464929 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.465046 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.465705 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts" (OuterVolumeSpecName: "scripts") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.473960 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config" (OuterVolumeSpecName: "config") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.474463 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.474654 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.478177 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk" (OuterVolumeSpecName: "kube-api-access-xs5nk") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "kube-api-access-xs5nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.484682 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.487937 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.503201 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.520135 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.520867 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell050c5-account-delete-7t9wx"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.526891 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.528653 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.534382 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.553719 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566193 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566377 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566449 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566471 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566554 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566577 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566605 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566629 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566671 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566689 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kstzg\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566709 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: 
\"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.566763 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd\") pod \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\" (UID: \"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42\") " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.580873 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-68897cb7f8-5wgv6"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.580937 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.589877 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.589888 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.589985 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs5nk\" (UniqueName: \"kubernetes.io/projected/05fb59c5-aa61-4ec3-866f-3a4551737f80-kube-api-access-xs5nk\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.590004 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.590016 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05fb59c5-aa61-4ec3-866f-3a4551737f80-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.590028 4882 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.590474 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.594432 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.594541 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.595055 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg" (OuterVolumeSpecName: "kube-api-access-kstzg") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "kube-api-access-kstzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.598477 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.598582 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info" (OuterVolumeSpecName: "pod-info") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.600205 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "05fb59c5-aa61-4ec3-866f-3a4551737f80" (UID: "05fb59c5-aa61-4ec3-866f-3a4551737f80"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.602496 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data" (OuterVolumeSpecName: "config-data") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.608634 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.610539 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancec9ca-account-delete-csrvn"] Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.643984 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf" (OuterVolumeSpecName: "server-conf") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691309 4882 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691343 4882 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691353 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691361 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691369 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691378 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kstzg\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-kube-api-access-kstzg\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691409 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691418 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691427 4882 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05fb59c5-aa61-4ec3-866f-3a4551737f80-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691436 4882 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.691443 4882 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.706490 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" (UID: "74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.713553 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.778074 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" path="/var/lib/kubelet/pods/06e85b8e-fa65-4016-adbb-e72100f18388/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.778760 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" path="/var/lib/kubelet/pods/0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.779261 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261d06a1-c07d-4430-9984-24531fa935c6" path="/var/lib/kubelet/pods/261d06a1-c07d-4430-9984-24531fa935c6/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.789449 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" path="/var/lib/kubelet/pods/2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.790372 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" path="/var/lib/kubelet/pods/51ada43e-a36e-49c7-bc9e-6c3151d2eb6b/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.791127 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" path="/var/lib/kubelet/pods/6537afcb-4015-45f6-bdb5-68e0625c6ea6/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.792166 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a85d34b-abed-4d9a-aa75-9781d96a4c8b" path="/var/lib/kubelet/pods/6a85d34b-abed-4d9a-aa75-9781d96a4c8b/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.792559 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.792579 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" path="/var/lib/kubelet/pods/6b9a74a6-19b0-4ab2-a047-9ff9c13137d7/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.792586 4882 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.793031 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" path="/var/lib/kubelet/pods/8a33ca09-ff99-44fd-a978-ef69315caf26/volumes" Oct 
02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.794314 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c726379-ca3b-428c-8091-c1870692c652" path="/var/lib/kubelet/pods/8c726379-ca3b-428c-8091-c1870692c652/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.794799 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" path="/var/lib/kubelet/pods/909dd4dd-0b5d-4b6b-b64a-7c59cd26256f/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.795935 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" path="/var/lib/kubelet/pods/ad4f3fde-e95f-404d-baac-1c6238494afa/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.796929 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" path="/var/lib/kubelet/pods/af2f3ba0-9530-4875-84e1-df99cc4761a6/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.797886 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" path="/var/lib/kubelet/pods/b156c0c6-4395-4609-8260-5ee8943d6813/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.799352 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" path="/var/lib/kubelet/pods/b5b37b99-d806-4dce-a73e-653f8ebc5567/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.800157 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" path="/var/lib/kubelet/pods/b80a4ca1-90cc-4c29-a2de-13b4db198cef/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.801202 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" path="/var/lib/kubelet/pods/bb8a47fb-b921-4b63-9529-a49d1ec506fb/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.802403 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2c7127-3671-4e79-aad9-01146803019e" path="/var/lib/kubelet/pods/bd2c7127-3671-4e79-aad9-01146803019e/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.803162 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1313f5-3aed-43a9-881d-bf61353ab6bd" path="/var/lib/kubelet/pods/ce1313f5-3aed-43a9-881d-bf61353ab6bd/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.803926 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" path="/var/lib/kubelet/pods/d6733b4d-ebf1-43cd-9960-3c25fca82e64/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.805172 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b05798-486d-493e-90eb-c09edb4bdc96" path="/var/lib/kubelet/pods/d7b05798-486d-493e-90eb-c09edb4bdc96/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.805781 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" path="/var/lib/kubelet/pods/e0ac3371-e1ab-4d5d-a543-9b9b68a0118a/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 16:41:44.806384 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" path="/var/lib/kubelet/pods/e0de09a9-9a37-4c03-abd4-002230d4f583/volumes" Oct 02 16:41:44 crc kubenswrapper[4882]: I1002 
16:41:44.807958 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" path="/var/lib/kubelet/pods/e8cf6351-a2e4-475f-a9f2-9006fee40049/volumes" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.290093 4882 generic.go:334] "Generic (PLEG): container finished" podID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerID="1f23372a0ffce94576fa7b8b12f78b5723d92783fb82f3c3a11c09be97afc71e" exitCode=0 Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.290140 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerDied","Data":"1f23372a0ffce94576fa7b8b12f78b5723d92783fb82f3c3a11c09be97afc71e"} Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.290178 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dd77366-9ed5-4258-b61c-70c6cd95d5c6","Type":"ContainerDied","Data":"ebdbe5ad9cdb46e7cd1007f3d75872a47746b63bf654406a4ff0504331a5d4ad"} Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.290222 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebdbe5ad9cdb46e7cd1007f3d75872a47746b63bf654406a4ff0504331a5d4ad" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.297040 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_05fb59c5-aa61-4ec3-866f-3a4551737f80/ovn-northd/0.log" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.297154 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"05fb59c5-aa61-4ec3-866f-3a4551737f80","Type":"ContainerDied","Data":"d69b7725b81aab41803daf13609b395e712fe9ae5c1d63f02a9c22bbf377d26e"} Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.297171 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.297227 4882 scope.go:117] "RemoveContainer" containerID="6f83cafdb181c103837b92a2bd78dedec85779409ecdc413b2e067e64c5babcf" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.300859 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.303163 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42","Type":"ContainerDied","Data":"27e111b66dcfa7e5cd3f137f243ac139b62161da38921f13664b42ce7030ba60"} Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.303250 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.326285 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.332512 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.333415 4882 scope.go:117] "RemoveContainer" containerID="cd5ba23a4a3f1c0666544f3c9ff2f4af1f1dbe5cddfbda9ee228b394145232f4" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.380850 4882 scope.go:117] "RemoveContainer" containerID="5681a9da320379f932adf71053bca8a31e077d6a1f4a09e0b83765c5625e4ade" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.386464 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.393070 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399392 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399481 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399539 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399601 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399640 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssk9g\" (UniqueName: \"kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399687 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.399735 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.400159 4882 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.400557 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.400741 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle\") pod \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\" (UID: \"1dd77366-9ed5-4258-b61c-70c6cd95d5c6\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.401315 4882 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.401343 4882 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.404047 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g" (OuterVolumeSpecName: "kube-api-access-ssk9g") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "kube-api-access-ssk9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.404402 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts" (OuterVolumeSpecName: "scripts") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.437953 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.451809 4882 scope.go:117] "RemoveContainer" containerID="56f5e136c1bbeef542d74f85786e2eb58e6cd2e6742745adf39c24f6ed827f4d" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.481846 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.502644 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssk9g\" (UniqueName: \"kubernetes.io/projected/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-kube-api-access-ssk9g\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.502688 4882 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.502704 4882 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.502716 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.503585 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data" (OuterVolumeSpecName: "config-data") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.514736 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dd77366-9ed5-4258-b61c-70c6cd95d5c6" (UID: "1dd77366-9ed5-4258-b61c-70c6cd95d5c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.604791 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.604828 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd77366-9ed5-4258-b61c-70c6cd95d5c6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.743972 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808009 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808331 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808391 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808451 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808475 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808559 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808578 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808598 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808636 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated\") pod \"fe1004bf-948f-4aae-b19b-a1321eab3b03\" (UID: \"fe1004bf-948f-4aae-b19b-a1321eab3b03\") " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.808784 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: 
"fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.809272 4882 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.809632 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.809840 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.810003 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.830761 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets" (OuterVolumeSpecName: "secrets") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.830782 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn" (OuterVolumeSpecName: "kube-api-access-8hwpn") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "kube-api-access-8hwpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.835620 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.844140 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.866311 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "fe1004bf-948f-4aae-b19b-a1321eab3b03" (UID: "fe1004bf-948f-4aae-b19b-a1321eab3b03"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910573 4882 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910618 4882 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910634 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910646 4882 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fe1004bf-948f-4aae-b19b-a1321eab3b03-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910658 4882 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910683 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910696 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1004bf-948f-4aae-b19b-a1321eab3b03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.910707 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/fe1004bf-948f-4aae-b19b-a1321eab3b03-kube-api-access-8hwpn\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:45 crc kubenswrapper[4882]: I1002 16:41:45.944900 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.011632 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.045526 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7df8f99cc4-7tfd9" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112467 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrrcs\" (UniqueName: \"kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112537 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112567 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112588 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112612 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112655 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112689 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.112716 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys\") pod \"d5be6998-0b42-475a-8418-032327087ace\" (UID: \"d5be6998-0b42-475a-8418-032327087ace\") " Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.117897 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs" (OuterVolumeSpecName: "kube-api-access-mrrcs") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "kube-api-access-mrrcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.119333 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.119411 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.125983 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts" (OuterVolumeSpecName: "scripts") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.158463 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.160366 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data" (OuterVolumeSpecName: "config-data") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.179548 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.189146 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d5be6998-0b42-475a-8418-032327087ace" (UID: "d5be6998-0b42-475a-8418-032327087ace"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214006 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214039 4882 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214050 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214059 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214071 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214079 4882 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214088 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrrcs\" (UniqueName: \"kubernetes.io/projected/d5be6998-0b42-475a-8418-032327087ace-kube-api-access-mrrcs\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.214096 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5be6998-0b42-475a-8418-032327087ace-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.308668 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.319660 4882 generic.go:334] "Generic (PLEG): container finished" podID="d5be6998-0b42-475a-8418-032327087ace" containerID="02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398" exitCode=0 Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.319763 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7df8f99cc4-7tfd9" event={"ID":"d5be6998-0b42-475a-8418-032327087ace","Type":"ContainerDied","Data":"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"} Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.319809 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7df8f99cc4-7tfd9" event={"ID":"d5be6998-0b42-475a-8418-032327087ace","Type":"ContainerDied","Data":"944ba7b88bc462c3ad42cc6d9d42d3f227e589ace3a3294c22873e53a3517968"} Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.319814 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7df8f99cc4-7tfd9"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.319833 4882 scope.go:117] "RemoveContainer" containerID="02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.325850 4882 generic.go:334] "Generic (PLEG): container finished" podID="b883970f-7b20-4f83-9b05-3b0469caf183" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45" exitCode=0
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.325928 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.325953 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b883970f-7b20-4f83-9b05-3b0469caf183","Type":"ContainerDied","Data":"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"}
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.326014 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b883970f-7b20-4f83-9b05-3b0469caf183","Type":"ContainerDied","Data":"7cefb0afebab7524603ee415809f5c8a71b000947237e97f22da960d93f032b9"}
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.335067 4882 generic.go:334] "Generic (PLEG): container finished" podID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerID="37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36" exitCode=0
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.335154 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerDied","Data":"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"}
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.335179 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fe1004bf-948f-4aae-b19b-a1321eab3b03","Type":"ContainerDied","Data":"baf46152ef0a6b3a70cbdb5edf7d9d04c22b36892b776a0c61643b1e1e801dc9"}
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.335277 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.356068 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.377528 4882 scope.go:117] "RemoveContainer" containerID="02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.380402 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398\": container with ID starting with 02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398 not found: ID does not exist" containerID="02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.380810 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398"} err="failed to get container status \"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398\": rpc error: code = NotFound desc = could not find container \"02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398\": container with ID starting with 02fdd3dab398b29a2cdee3cc3aec32548aefa47c0f4b4f34bd0c7f8d0fb5d398 not found: ID does not exist"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.380850 4882 scope.go:117] "RemoveContainer" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.387017 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.397538 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7df8f99cc4-7tfd9"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.402374 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.409814 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.414652 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.418450 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle\") pod \"b883970f-7b20-4f83-9b05-3b0469caf183\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") "
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.418540 4882 scope.go:117] "RemoveContainer" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.418605 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llfqf\" (UniqueName: \"kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf\") pod \"b883970f-7b20-4f83-9b05-3b0469caf183\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") "
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.418638 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data\") pod \"b883970f-7b20-4f83-9b05-3b0469caf183\" (UID: \"b883970f-7b20-4f83-9b05-3b0469caf183\") "
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.418680 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.419499 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45\": container with ID starting with c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45 not found: ID does not exist" containerID="c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.419553 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45"} err="failed to get container status \"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45\": rpc error: code = NotFound desc = could not find container \"c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45\": container with ID starting with c6896f926796b9638775222e7708585de526a55808faeeab98b57712bb996c45 not found: ID does not exist"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.419589 4882 scope.go:117] "RemoveContainer" containerID="37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.422276 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf" (OuterVolumeSpecName: "kube-api-access-llfqf") pod "b883970f-7b20-4f83-9b05-3b0469caf183" (UID: "b883970f-7b20-4f83-9b05-3b0469caf183"). InnerVolumeSpecName "kube-api-access-llfqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.437883 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b883970f-7b20-4f83-9b05-3b0469caf183" (UID: "b883970f-7b20-4f83-9b05-3b0469caf183"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.442206 4882 scope.go:117] "RemoveContainer" containerID="97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.444861 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data" (OuterVolumeSpecName: "config-data") pod "b883970f-7b20-4f83-9b05-3b0469caf183" (UID: "b883970f-7b20-4f83-9b05-3b0469caf183"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.469606 4882 scope.go:117] "RemoveContainer" containerID="37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.472923 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36\": container with ID starting with 37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36 not found: ID does not exist" containerID="37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.472972 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36"} err="failed to get container status \"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36\": rpc error: code = NotFound desc = could not find container \"37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36\": container with ID starting with 37c5a6cedcf139905e25b7538dc80bf96a38b27fe90f0eb5d90dd66d2b792c36 not found: ID does not exist"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.473006 4882 scope.go:117] "RemoveContainer" containerID="97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.473333 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854\": container with ID starting with 97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854 not found: ID does not exist" containerID="97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.473457 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854"} err="failed to get container status \"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854\": rpc error: code = NotFound desc = could not find container \"97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854\": container with ID starting with 97516c217c91987ce391503a4babfdf913b49b5b8a2dec9fa7175a3ed6b5d854 not found: ID does not exist"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.520760 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.520799 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llfqf\" (UniqueName: \"kubernetes.io/projected/b883970f-7b20-4f83-9b05-3b0469caf183-kube-api-access-llfqf\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.520811 4882 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b883970f-7b20-4f83-9b05-3b0469caf183-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.571765 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.572173 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.572591 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.572637 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server"
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.575918 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.578969 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.580829 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:46 crc kubenswrapper[4882]: E1002 16:41:46.580871 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd"
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.659484 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.667542 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
pods=["openstack/nova-cell0-conductor-0"] Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.782104 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" path="/var/lib/kubelet/pods/05fb59c5-aa61-4ec3-866f-3a4551737f80/volumes" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.782833 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" path="/var/lib/kubelet/pods/1dd77366-9ed5-4258-b61c-70c6cd95d5c6/volumes" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.784189 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" path="/var/lib/kubelet/pods/74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42/volumes" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.784750 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" path="/var/lib/kubelet/pods/b883970f-7b20-4f83-9b05-3b0469caf183/volumes" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.785190 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5be6998-0b42-475a-8418-032327087ace" path="/var/lib/kubelet/pods/d5be6998-0b42-475a-8418-032327087ace/volumes" Oct 02 16:41:46 crc kubenswrapper[4882]: I1002 16:41:46.786908 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" path="/var/lib/kubelet/pods/fe1004bf-948f-4aae-b19b-a1321eab3b03/volumes" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.884855 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c45zs"] Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885690 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-central-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885704 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-central-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885714 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885721 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885732 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885739 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885748 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-notification-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885754 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-notification-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885763 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" 
containerName="nova-scheduler-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885769 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" containerName="nova-scheduler-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885785 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885792 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885799 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-server" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885805 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-server" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885814 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c7127-3671-4e79-aad9-01146803019e" containerName="nova-cell1-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885820 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c7127-3671-4e79-aad9-01146803019e" containerName="nova-cell1-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885831 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" containerName="memcached" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885837 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" containerName="memcached" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885847 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c726379-ca3b-428c-8091-c1870692c652" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885854 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c726379-ca3b-428c-8091-c1870692c652" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885865 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261d06a1-c07d-4430-9984-24531fa935c6" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885872 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="261d06a1-c07d-4430-9984-24531fa935c6" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885882 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="mysql-bootstrap" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885888 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="mysql-bootstrap" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885898 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885903 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885914 4882 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="ovsdbserver-nb" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885919 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="ovsdbserver-nb" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885926 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="cinder-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885934 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="cinder-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885944 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885950 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885961 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885966 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885975 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1313f5-3aed-43a9-881d-bf61353ab6bd" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885982 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1313f5-3aed-43a9-881d-bf61353ab6bd" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.885993 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.885998 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886007 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886013 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886020 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886026 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886035 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b05798-486d-493e-90eb-c09edb4bdc96" containerName="kube-state-metrics" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886041 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b05798-486d-493e-90eb-c09edb4bdc96" containerName="kube-state-metrics" Oct 
02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886053 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886059 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886071 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" containerName="nova-cell0-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886077 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" containerName="nova-cell0-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886086 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886092 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886099 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886105 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886112 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886118 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886128 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="init" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886135 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="init" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886144 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886150 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886162 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886169 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886179 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="setup-container" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886186 4882 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="setup-container" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886193 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886199 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886208 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886229 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886235 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="ovsdbserver-sb" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886240 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="ovsdbserver-sb" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886249 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886255 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886266 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886338 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886351 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886357 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886365 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="probe" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886371 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="probe" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886381 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="dnsmasq-dns" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886387 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="dnsmasq-dns" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886395 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="setup-container" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886401 4882 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="setup-container" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886411 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886417 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886423 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886429 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886439 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886445 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-api" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886456 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886462 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886470 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886476 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886484 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886490 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886500 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="sg-core" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886506 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="sg-core" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886516 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886522 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886529 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886534 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" 
containerName="barbican-worker" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886546 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886552 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886562 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5be6998-0b42-475a-8418-032327087ace" containerName="keystone-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886568 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5be6998-0b42-475a-8418-032327087ace" containerName="keystone-api" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886579 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886584 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-api" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886590 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886596 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886605 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886611 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886616 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a85d34b-abed-4d9a-aa75-9781d96a4c8b" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886624 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a85d34b-abed-4d9a-aa75-9781d96a4c8b" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886634 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92887968-fdd5-4653-a151-70e4a8f963fc" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886640 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="92887968-fdd5-4653-a151-70e4a8f963fc" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: E1002 16:41:49.886654 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="mysql-bootstrap" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886662 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="mysql-bootstrap" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886819 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886832 4882 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886840 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886851 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1313f5-3aed-43a9-881d-bf61353ab6bd" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886863 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886870 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b05798-486d-493e-90eb-c09edb4bdc96" containerName="kube-state-metrics" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886877 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886882 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c726379-ca3b-428c-8091-c1870692c652" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886888 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1004bf-948f-4aae-b19b-a1321eab3b03" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886896 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" containerName="cinder-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886906 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ada43e-a36e-49c7-bc9e-6c3151d2eb6b" containerName="memcached" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886914 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886922 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886931 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fb59c5-aa61-4ec3-866f-3a4551737f80" containerName="ovn-northd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886940 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886946 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a85d34b-abed-4d9a-aa75-9781d96a4c8b" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886958 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886964 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="142d02f0-5616-42b6-b6fc-b37df2639f8a" containerName="ovsdbserver-nb" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886970 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a33ca09-ff99-44fd-a978-ef69315caf26" 
containerName="cinder-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886980 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-notification-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886990 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b156c0c6-4395-4609-8260-5ee8943d6813" containerName="galera" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.886998 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5be6998-0b42-475a-8418-032327087ace" containerName="keystone-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887007 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="ovsdbserver-sb" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887015 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de09a9-9a37-4c03-abd4-002230d4f583" containerName="glance-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887025 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f4b488-20e6-4007-b1b4-891b06b16276" containerName="dnsmasq-dns" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887033 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a0ea19-b66f-4dc9-95a5-b6dd8fc3eb42" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887044 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="sg-core" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887053 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887059 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2c7127-3671-4e79-aad9-01146803019e" containerName="nova-cell1-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887068 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6733b4d-ebf1-43cd-9960-3c25fca82e64" containerName="barbican-keystone-listener" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887077 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8a47fb-b921-4b63-9529-a49d1ec506fb" containerName="barbican-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887085 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5a965f-ea7b-44a8-9fc4-6cd88f67bdfc" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887093 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e85b8e-fa65-4016-adbb-e72100f18388" containerName="nova-scheduler-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887099 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e205184e-bcf8-498d-8a1a-bc1c8539c2ae" containerName="ovn-controller" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887108 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80a4ca1-90cc-4c29-a2de-13b4db198cef" containerName="barbican-worker" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887117 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="probe" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 
16:41:49.887123 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4f3fde-e95f-404d-baac-1c6238494afa" containerName="rabbitmq" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887133 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8cf6351-a2e4-475f-a9f2-9006fee40049" containerName="glance-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887141 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887150 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9a74a6-19b0-4ab2-a047-9ff9c13137d7" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887155 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe9edcd-b8d3-4b0f-8e0f-dd81a56da636" containerName="proxy-server" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887164 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b883970f-7b20-4f83-9b05-3b0469caf183" containerName="nova-cell0-conductor-conductor" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887175 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="proxy-httpd" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887183 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="6537afcb-4015-45f6-bdb5-68e0625c6ea6" containerName="placement-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887189 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="909dd4dd-0b5d-4b6b-b64a-7c59cd26256f" containerName="nova-api-api" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887198 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-log" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887223 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2f3ba0-9530-4875-84e1-df99cc4761a6" containerName="cinder-scheduler" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887230 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="261d06a1-c07d-4430-9984-24531fa935c6" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887238 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd77366-9ed5-4258-b61c-70c6cd95d5c6" containerName="ceilometer-central-agent" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887246 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="92887968-fdd5-4653-a151-70e4a8f963fc" containerName="openstack-network-exporter" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887255 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b37b99-d806-4dce-a73e-653f8ebc5567" containerName="nova-metadata-metadata" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.887263 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ac3371-e1ab-4d5d-a543-9b9b68a0118a" containerName="mariadb-account-delete" Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.888358 4882 util.go:30] "No sandbox for pod can be found. 
Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.898142 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c45zs"]
Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.967076 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fql22\" (UniqueName: \"kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.967130 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:49 crc kubenswrapper[4882]: I1002 16:41:49.967169 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.068738 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.069036 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fql22\" (UniqueName: \"kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.069109 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.069587 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.069781 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.090711 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fql22\" (UniqueName: \"kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22\") pod \"community-operators-c45zs\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.247560 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c45zs"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.256737 4882 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7994f6475f-bw8mv" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused"
Oct 02 16:41:50 crc kubenswrapper[4882]: I1002 16:41:50.614881 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c45zs"]
Oct 02 16:41:51 crc kubenswrapper[4882]: I1002 16:41:51.414972 4882 generic.go:334] "Generic (PLEG): container finished" podID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerID="270ef41f5adc8533d9617327d05c65d3b2fdd4a5b624282402752280195b957c" exitCode=0
Oct 02 16:41:51 crc kubenswrapper[4882]: I1002 16:41:51.415088 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerDied","Data":"270ef41f5adc8533d9617327d05c65d3b2fdd4a5b624282402752280195b957c"}
Oct 02 16:41:51 crc kubenswrapper[4882]: I1002 16:41:51.415327 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerStarted","Data":"c2b651e7519d2f263f14f3404568178c74d3759f7d43eeb6878a9bc54418e358"}
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.572522 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.573160 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.574186 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.574250 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.574291 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server"
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.576368 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.579481 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:51 crc kubenswrapper[4882]: E1002 16:41:51.579572 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd"
Oct 02 16:41:53 crc kubenswrapper[4882]: I1002 16:41:53.437935 4882 generic.go:334] "Generic (PLEG): container finished" podID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerID="f3918b61a7037413f9886fa2a7fcd8ea9dd99b51ff30ca7dfdfcc1ca919fe931" exitCode=0
Oct 02 16:41:53 crc kubenswrapper[4882]: I1002 16:41:53.438086 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerDied","Data":"f3918b61a7037413f9886fa2a7fcd8ea9dd99b51ff30ca7dfdfcc1ca919fe931"}
Oct 02 16:41:54 crc kubenswrapper[4882]: I1002 16:41:54.453583 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerStarted","Data":"9c2130a2e7ee52874d183a15819b1126f9b54d62fe74a494236a4c3dfa1fa761"}
Oct 02 16:41:54 crc kubenswrapper[4882]: I1002 16:41:54.454731 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hzll4"
Oct 02 16:41:54 crc kubenswrapper[4882]: I1002 16:41:54.474993 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c45zs" podStartSLOduration=3.01725411 podStartE2EDuration="5.474972364s" podCreationTimestamp="2025-10-02 16:41:49 +0000 UTC" firstStartedPulling="2025-10-02 16:41:51.416337831 +0000 UTC m=+1470.165567358" lastFinishedPulling="2025-10-02 16:41:53.874056095 +0000 UTC m=+1472.623285612" observedRunningTime="2025-10-02 16:41:54.472279467 +0000 UTC m=+1473.221508994" watchObservedRunningTime="2025-10-02 16:41:54.474972364 +0000 UTC m=+1473.224201891"
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.262941 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"]
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.263618 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hzll4" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="registry-server" containerID="cri-o://affa33b2bf077d80aecdfb407f2de76819f10f06b17bbaa1e1f4ae1023900018" gracePeriod=2
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.481204 4882 generic.go:334] "Generic (PLEG): container finished" podID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerID="affa33b2bf077d80aecdfb407f2de76819f10f06b17bbaa1e1f4ae1023900018" exitCode=0
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.481251 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerDied","Data":"affa33b2bf077d80aecdfb407f2de76819f10f06b17bbaa1e1f4ae1023900018"}
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.571924 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.572191 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.572406 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.572434 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server"
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.575346 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.577276 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.578455 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 16:41:56 crc kubenswrapper[4882]: E1002 16:41:56.578530 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd"
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.662394 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzll4"
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.778326 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content\") pod \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") "
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.778389 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qgsc\" (UniqueName: \"kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc\") pod \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") "
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.778533 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities\") pod \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\" (UID: \"dc1acdb9-23dd-43cd-b568-0e6a04f0db71\") "
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.779356 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities" (OuterVolumeSpecName: "utilities") pod "dc1acdb9-23dd-43cd-b568-0e6a04f0db71" (UID: "dc1acdb9-23dd-43cd-b568-0e6a04f0db71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.783806 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc" (OuterVolumeSpecName: "kube-api-access-9qgsc") pod "dc1acdb9-23dd-43cd-b568-0e6a04f0db71" (UID: "dc1acdb9-23dd-43cd-b568-0e6a04f0db71"). InnerVolumeSpecName "kube-api-access-9qgsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.804370 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc1acdb9-23dd-43cd-b568-0e6a04f0db71" (UID: "dc1acdb9-23dd-43cd-b568-0e6a04f0db71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.881790 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.881856 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:56 crc kubenswrapper[4882]: I1002 16:41:56.881889 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qgsc\" (UniqueName: \"kubernetes.io/projected/dc1acdb9-23dd-43cd-b568-0e6a04f0db71-kube-api-access-9qgsc\") on node \"crc\" DevicePath \"\""
Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.493453 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzll4" event={"ID":"dc1acdb9-23dd-43cd-b568-0e6a04f0db71","Type":"ContainerDied","Data":"0569fe3ebe9a74515d4b6735e93a13ed9e9165e9474837bc90084d4a0ad1fe70"}
Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.493518 4882 scope.go:117] "RemoveContainer" containerID="affa33b2bf077d80aecdfb407f2de76819f10f06b17bbaa1e1f4ae1023900018"
Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.493556 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzll4" Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.516511 4882 scope.go:117] "RemoveContainer" containerID="591e0b2c9d313cf5f0e413e1051aa9585b0f95b568ba3d808feaeffe7aa56741" Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.534026 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"] Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.536100 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzll4"] Oct 02 16:41:57 crc kubenswrapper[4882]: I1002 16:41:57.549925 4882 scope.go:117] "RemoveContainer" containerID="7a6b59e62874a5415470e6e26597234895bad91ce2daf76f62f504d8922025ec" Oct 02 16:41:58 crc kubenswrapper[4882]: I1002 16:41:58.772246 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" path="/var/lib/kubelet/pods/dc1acdb9-23dd-43cd-b568-0e6a04f0db71/volumes" Oct 02 16:41:59 crc kubenswrapper[4882]: I1002 16:41:59.515139 4882 generic.go:334] "Generic (PLEG): container finished" podID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerID="94c670bb838359315515a372b7547069910917af30838091af0996d23b21c9e8" exitCode=0 Oct 02 16:41:59 crc kubenswrapper[4882]: I1002 16:41:59.515202 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerDied","Data":"94c670bb838359315515a372b7547069910917af30838091af0996d23b21c9e8"} Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.057098 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.133862 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.133922 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.133939 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.133974 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-httpd-config\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.134086 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc 
kubenswrapper[4882]: I1002 16:42:00.134112 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.134151 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sw6q\" (UniqueName: \"kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q\") pod \"0fd93bba-dd83-4256-952a-d60fd3cefef4\" (UID: \"0fd93bba-dd83-4256-952a-d60fd3cefef4\") " Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.139191 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.148469 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q" (OuterVolumeSpecName: "kube-api-access-4sw6q") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "kube-api-access-4sw6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.173497 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config" (OuterVolumeSpecName: "config") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.173643 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.175517 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.188256 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.206897 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0fd93bba-dd83-4256-952a-d60fd3cefef4" (UID: "0fd93bba-dd83-4256-952a-d60fd3cefef4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.236947 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sw6q\" (UniqueName: \"kubernetes.io/projected/0fd93bba-dd83-4256-952a-d60fd3cefef4-kube-api-access-4sw6q\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.236985 4882 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.236998 4882 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.237007 4882 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.237017 4882 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.237027 4882 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.237036 4882 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd93bba-dd83-4256-952a-d60fd3cefef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.249294 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.249661 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.297981 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.537241 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7994f6475f-bw8mv" event={"ID":"0fd93bba-dd83-4256-952a-d60fd3cefef4","Type":"ContainerDied","Data":"c7c496ab2be5e70623612e2f8bc917a18fad6211fd9c5cf636941cdde346d7ed"} Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.537554 4882 scope.go:117] "RemoveContainer" containerID="0917f875d2bdf0cb2a5ee7365787c2bcba638e2049e9094611df5f28dc9f15e9" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.537296 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7994f6475f-bw8mv" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.574732 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7994f6475f-bw8mv"] Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.577666 4882 scope.go:117] "RemoveContainer" containerID="94c670bb838359315515a372b7547069910917af30838091af0996d23b21c9e8" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.579457 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7994f6475f-bw8mv"] Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.601551 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:00 crc kubenswrapper[4882]: I1002 16:42:00.771833 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" path="/var/lib/kubelet/pods/0fd93bba-dd83-4256-952a-d60fd3cefef4/volumes" Oct 02 16:42:01 crc kubenswrapper[4882]: I1002 16:42:01.461284 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c45zs"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.571360 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.571655 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.572879 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.572919 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.573104 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.574413 4882 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.575964 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:01 crc kubenswrapper[4882]: E1002 16:42:01.576055 4882 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:42:02 crc kubenswrapper[4882]: I1002 16:42:02.567009 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c45zs" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="registry-server" containerID="cri-o://9c2130a2e7ee52874d183a15819b1126f9b54d62fe74a494236a4c3dfa1fa761" gracePeriod=2 Oct 02 16:42:03 crc kubenswrapper[4882]: I1002 16:42:03.576136 4882 generic.go:334] "Generic (PLEG): container finished" podID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerID="9c2130a2e7ee52874d183a15819b1126f9b54d62fe74a494236a4c3dfa1fa761" exitCode=0 Oct 02 16:42:03 crc kubenswrapper[4882]: I1002 16:42:03.576223 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerDied","Data":"9c2130a2e7ee52874d183a15819b1126f9b54d62fe74a494236a4c3dfa1fa761"} Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.351846 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.403709 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content\") pod \"e835f425-aa72-4aa1-aadf-b190ea334fe2\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.403800 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fql22\" (UniqueName: \"kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22\") pod \"e835f425-aa72-4aa1-aadf-b190ea334fe2\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.403877 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities\") pod \"e835f425-aa72-4aa1-aadf-b190ea334fe2\" (UID: \"e835f425-aa72-4aa1-aadf-b190ea334fe2\") " Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.405464 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities" (OuterVolumeSpecName: "utilities") pod "e835f425-aa72-4aa1-aadf-b190ea334fe2" (UID: "e835f425-aa72-4aa1-aadf-b190ea334fe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.413540 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22" (OuterVolumeSpecName: "kube-api-access-fql22") pod "e835f425-aa72-4aa1-aadf-b190ea334fe2" (UID: "e835f425-aa72-4aa1-aadf-b190ea334fe2"). InnerVolumeSpecName "kube-api-access-fql22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.506414 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fql22\" (UniqueName: \"kubernetes.io/projected/e835f425-aa72-4aa1-aadf-b190ea334fe2-kube-api-access-fql22\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.506454 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.590873 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c45zs" event={"ID":"e835f425-aa72-4aa1-aadf-b190ea334fe2","Type":"ContainerDied","Data":"c2b651e7519d2f263f14f3404568178c74d3759f7d43eeb6878a9bc54418e358"} Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.590960 4882 scope.go:117] "RemoveContainer" containerID="9c2130a2e7ee52874d183a15819b1126f9b54d62fe74a494236a4c3dfa1fa761" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.591207 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c45zs" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.617337 4882 scope.go:117] "RemoveContainer" containerID="f3918b61a7037413f9886fa2a7fcd8ea9dd99b51ff30ca7dfdfcc1ca919fe931" Oct 02 16:42:04 crc kubenswrapper[4882]: I1002 16:42:04.643034 4882 scope.go:117] "RemoveContainer" containerID="270ef41f5adc8533d9617327d05c65d3b2fdd4a5b624282402752280195b957c" Oct 02 16:42:05 crc kubenswrapper[4882]: I1002 16:42:05.616369 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wsqlw_21667760-8ee1-456b-af11-a501cdf77822/ovs-vswitchd/0.log" Oct 02 16:42:05 crc kubenswrapper[4882]: I1002 16:42:05.617326 4882 generic.go:334] "Generic (PLEG): container finished" podID="21667760-8ee1-456b-af11-a501cdf77822" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" exitCode=137 Oct 02 16:42:05 crc kubenswrapper[4882]: I1002 16:42:05.617363 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerDied","Data":"8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490"} Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.571583 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490 is running failed: container process not found" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.571837 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.572099 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490 is running failed: container process not found" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.572469 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.572826 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.572877 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.573164 4882 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490 is running failed: container process not found" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 16:42:06 crc kubenswrapper[4882]: E1002 16:42:06.573240 4882 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-wsqlw" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:42:07 crc kubenswrapper[4882]: I1002 16:42:07.644821 4882 generic.go:334] "Generic (PLEG): container finished" podID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerID="beb784c14f77f12a3964b2ef2082da3578704e6f5267fe025604c315cd46e6ba" exitCode=137 Oct 02 16:42:07 crc kubenswrapper[4882]: I1002 16:42:07.644908 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"beb784c14f77f12a3964b2ef2082da3578704e6f5267fe025604c315cd46e6ba"} Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.090489 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e835f425-aa72-4aa1-aadf-b190ea334fe2" (UID: "e835f425-aa72-4aa1-aadf-b190ea334fe2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.165527 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e835f425-aa72-4aa1-aadf-b190ea334fe2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.230849 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c45zs"] Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.239111 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c45zs"] Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.520686 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.578411 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") pod \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.578585 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock\") pod \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.578710 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.578756 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjmgj\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj\") pod \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.578951 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache\") pod \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\" (UID: \"9cd26acb-1d48-48f3-b39d-b274bdcd3cce\") " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.579351 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock" (OuterVolumeSpecName: "lock") pod "9cd26acb-1d48-48f3-b39d-b274bdcd3cce" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.579567 4882 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-lock\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.579578 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache" (OuterVolumeSpecName: "cache") pod "9cd26acb-1d48-48f3-b39d-b274bdcd3cce" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.587655 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "9cd26acb-1d48-48f3-b39d-b274bdcd3cce" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.588524 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj" (OuterVolumeSpecName: "kube-api-access-tjmgj") pod "9cd26acb-1d48-48f3-b39d-b274bdcd3cce" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce"). InnerVolumeSpecName "kube-api-access-tjmgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.588597 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9cd26acb-1d48-48f3-b39d-b274bdcd3cce" (UID: "9cd26acb-1d48-48f3-b39d-b274bdcd3cce"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.660974 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9cd26acb-1d48-48f3-b39d-b274bdcd3cce","Type":"ContainerDied","Data":"ce8731e432f4f67124e22949151742a3ebe8662b136718c00dba885eb4640bf1"} Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.661064 4882 scope.go:117] "RemoveContainer" containerID="beb784c14f77f12a3964b2ef2082da3578704e6f5267fe025604c315cd46e6ba" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.661150 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.681418 4882 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.681485 4882 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.681501 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjmgj\" (UniqueName: \"kubernetes.io/projected/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-kube-api-access-tjmgj\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.681515 4882 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9cd26acb-1d48-48f3-b39d-b274bdcd3cce-cache\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.699516 4882 scope.go:117] "RemoveContainer" containerID="01c2f3830e5d56cd56ba9cb9e3532e635f8aa8907530ce2426f74c73661d8d82" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.705469 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.708786 4882 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.712031 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.719144 4882 scope.go:117] "RemoveContainer" containerID="7f14fdb5932d30e2872b1a5885f0b55773e7edd9a6545624f4e0a5cc89fc9950" Oct 02 16:42:08 
crc kubenswrapper[4882]: I1002 16:42:08.749424 4882 scope.go:117] "RemoveContainer" containerID="a02d6df731a02fe06af482e56bb798e1405d30e9a7582568ae70058f44d8649b" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.769852 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" path="/var/lib/kubelet/pods/9cd26acb-1d48-48f3-b39d-b274bdcd3cce/volumes" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.770190 4882 scope.go:117] "RemoveContainer" containerID="dd782c2cb9bfe49afd713e8f98727186b4594352b4f94c3334399dcf0f3ddcde" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.772328 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" path="/var/lib/kubelet/pods/e835f425-aa72-4aa1-aadf-b190ea334fe2/volumes" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.782658 4882 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.886014 4882 scope.go:117] "RemoveContainer" containerID="c324410a34d486d3741d64c1759808560c80771c9771319557a7fa0a96becbf8" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.913890 4882 scope.go:117] "RemoveContainer" containerID="64ceceaaaf2974dd1ac76651c1df1b0fc8e1bf419b93266c480f050b334d0268" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.942164 4882 scope.go:117] "RemoveContainer" containerID="98c3f10e6bd125fee0ba2ce83eeb60ba8e4bb2141fa7130276214ba1b0155863" Oct 02 16:42:08 crc kubenswrapper[4882]: I1002 16:42:08.998707 4882 scope.go:117] "RemoveContainer" containerID="e68f25a45eb857be77e95e76cbef672ca88b8dfae0b6833226a9a98f31867462" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.025454 4882 scope.go:117] "RemoveContainer" containerID="7c21495965895e4c5e03f6e2ccf050c5159064c4ed28968f122e21de15ea7bf0" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.061700 4882 scope.go:117] "RemoveContainer" containerID="0df9344da4286c51066abdcf7f4044bd806c337ad8fb5c3fd6ad431458d2179a" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.082273 4882 scope.go:117] "RemoveContainer" containerID="92d1d3fc20a44da92843f516fd7e287b2820813b7b752be600e88b83707ea9d9" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.120421 4882 scope.go:117] "RemoveContainer" containerID="19db6cdfa5b1c6ddeac4c3af0c7dac82506480145ee4d50af2f88dd3e7515251" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.146580 4882 scope.go:117] "RemoveContainer" containerID="f6dd5331866217f7c3fd5cb62ed15451551f9d0a3146c188905b3c00fb8cb139" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.165282 4882 scope.go:117] "RemoveContainer" containerID="d8caa8721d90f9ef2df17eeeff1e9b45a19acf29fff7c7ee1e067ac7a621cebe" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.552256 4882 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7132edf6-8a37-4230-a3a5-4703be721a78"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7132edf6-8a37-4230-a3a5-4703be721a78] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7132edf6_8a37_4230_a3a5_4703be721a78.slice" Oct 02 16:42:09 crc kubenswrapper[4882]: E1002 16:42:09.552335 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7132edf6-8a37-4230-a3a5-4703be721a78] : unable to destroy 
cgroup paths for cgroup [kubepods besteffort pod7132edf6-8a37-4230-a3a5-4703be721a78] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7132edf6_8a37_4230_a3a5_4703be721a78.slice" pod="openstack/ovsdbserver-sb-0" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.673517 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.748362 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wsqlw_21667760-8ee1-456b-af11-a501cdf77822/ovs-vswitchd/0.log" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.750197 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.755427 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.760277 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.795863 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.795926 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.795955 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.795997 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run" (OuterVolumeSpecName: "var-run") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796007 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib" (OuterVolumeSpecName: "var-lib") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796032 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m94dk\" (UniqueName: \"kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796066 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796137 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796201 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log\") pod \"21667760-8ee1-456b-af11-a501cdf77822\" (UID: \"21667760-8ee1-456b-af11-a501cdf77822\") " Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796352 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log" (OuterVolumeSpecName: "var-log") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796582 4882 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-lib\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796598 4882 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796608 4882 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-log\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.796617 4882 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/21667760-8ee1-456b-af11-a501cdf77822-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.797396 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts" (OuterVolumeSpecName: "scripts") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.803599 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk" (OuterVolumeSpecName: "kube-api-access-m94dk") pod "21667760-8ee1-456b-af11-a501cdf77822" (UID: "21667760-8ee1-456b-af11-a501cdf77822"). InnerVolumeSpecName "kube-api-access-m94dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.897575 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m94dk\" (UniqueName: \"kubernetes.io/projected/21667760-8ee1-456b-af11-a501cdf77822-kube-api-access-m94dk\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:09 crc kubenswrapper[4882]: I1002 16:42:09.897619 4882 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21667760-8ee1-456b-af11-a501cdf77822-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.689780 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wsqlw_21667760-8ee1-456b-af11-a501cdf77822/ovs-vswitchd/0.log" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.691357 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wsqlw" event={"ID":"21667760-8ee1-456b-af11-a501cdf77822","Type":"ContainerDied","Data":"d79a25598f3a568bf528327c7efdea8d9bb84f3121c50ff5aa84475e9f96c441"} Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.691425 4882 scope.go:117] "RemoveContainer" containerID="8bb261ef8a35af5af77206b91cd5fa5dd07b3e4257d28da375623181696cf490" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.691423 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wsqlw" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.726080 4882 scope.go:117] "RemoveContainer" containerID="6733e7ed183531c5e27f58bc0e509ca10d85000793b439475b09ad8cdfc08547" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.746350 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.754416 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-wsqlw"] Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.783196 4882 scope.go:117] "RemoveContainer" containerID="ddb5fdad9fc0ae4cf9f153e2b427d0512f3422d8375ee4e7f8e97b3474db9d80" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.793976 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21667760-8ee1-456b-af11-a501cdf77822" path="/var/lib/kubelet/pods/21667760-8ee1-456b-af11-a501cdf77822/volumes" Oct 02 16:42:10 crc kubenswrapper[4882]: I1002 16:42:10.795875 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7132edf6-8a37-4230-a3a5-4703be721a78" path="/var/lib/kubelet/pods/7132edf6-8a37-4230-a3a5-4703be721a78/volumes" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.247443 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cindera969-account-delete-pgcm7" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.337358 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lr4v\" (UniqueName: \"kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v\") pod \"bb7cef74-9b55-4213-a934-7c1d2c058aab\" (UID: \"bb7cef74-9b55-4213-a934-7c1d2c058aab\") " Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.342320 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v" (OuterVolumeSpecName: "kube-api-access-4lr4v") pod "bb7cef74-9b55-4213-a934-7c1d2c058aab" (UID: "bb7cef74-9b55-4213-a934-7c1d2c058aab"). InnerVolumeSpecName "kube-api-access-4lr4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.439271 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lr4v\" (UniqueName: \"kubernetes.io/projected/bb7cef74-9b55-4213-a934-7c1d2c058aab-kube-api-access-4lr4v\") on node \"crc\" DevicePath \"\"" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.720781 4882 generic.go:334] "Generic (PLEG): container finished" podID="bb7cef74-9b55-4213-a934-7c1d2c058aab" containerID="e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a" exitCode=137 Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.720838 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindera969-account-delete-pgcm7" event={"ID":"bb7cef74-9b55-4213-a934-7c1d2c058aab","Type":"ContainerDied","Data":"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a"} Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.720900 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindera969-account-delete-pgcm7" event={"ID":"bb7cef74-9b55-4213-a934-7c1d2c058aab","Type":"ContainerDied","Data":"c7c8f20f0aa3b08760289bcc88e807fce20aff7bc381cbd60edcf8168e662923"} Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.720850 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cindera969-account-delete-pgcm7" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.720931 4882 scope.go:117] "RemoveContainer" containerID="e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.757838 4882 scope.go:117] "RemoveContainer" containerID="e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a" Oct 02 16:42:12 crc kubenswrapper[4882]: E1002 16:42:12.759334 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a\": container with ID starting with e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a not found: ID does not exist" containerID="e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.759374 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a"} err="failed to get container status \"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a\": rpc error: code = NotFound desc = could not find container \"e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a\": container with ID starting with e2ae1103fdf45fd46cc7ebf260c7f3bc6f178a9493a2d6c9b0492238c221f63a not found: ID does not exist" Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.779649 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cindera969-account-delete-pgcm7"] Oct 02 16:42:12 crc kubenswrapper[4882]: I1002 16:42:12.781040 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cindera969-account-delete-pgcm7"] Oct 02 16:42:14 crc kubenswrapper[4882]: I1002 16:42:14.777979 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7cef74-9b55-4213-a934-7c1d2c058aab" path="/var/lib/kubelet/pods/bb7cef74-9b55-4213-a934-7c1d2c058aab/volumes" Oct 02 16:42:26 crc kubenswrapper[4882]: I1002 16:42:26.000921 4882 scope.go:117] "RemoveContainer" containerID="9cda4521dc256012ae948d734158af8a41aa53200ae9a9157a93bd3abc5344ef" Oct 02 16:42:26 crc kubenswrapper[4882]: I1002 16:42:26.043124 4882 scope.go:117] "RemoveContainer" containerID="cabd7eae8001349116cafdd29dd1df7fb3b6e49cd24abaa8c9cf5171da8dae76" Oct 02 16:42:26 crc kubenswrapper[4882]: I1002 16:42:26.070024 4882 scope.go:117] "RemoveContainer" containerID="052368bb59f9409034eaecd11389763ff8b3157c6e295d8f7e64b4884fd743eb" Oct 02 16:42:26 crc kubenswrapper[4882]: I1002 16:42:26.091834 4882 scope.go:117] "RemoveContainer" containerID="fd3d1168a9052070845923b9e8dea6c765e24e4825cb3252fba286141fb23f1a" Oct 02 16:42:26 crc kubenswrapper[4882]: I1002 16:42:26.118541 4882 scope.go:117] "RemoveContainer" containerID="304cdf18a2c81783fa09ece0a20e7edbd4a65aacd48e7453c72803ed54eaa9cd" Oct 02 16:42:39 crc kubenswrapper[4882]: I1002 16:42:39.390877 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:42:39 crc kubenswrapper[4882]: I1002 16:42:39.391575 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:43:09 crc kubenswrapper[4882]: I1002 16:43:09.389740 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:43:09 crc kubenswrapper[4882]: I1002 16:43:09.390248 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.827734 4882 scope.go:117] "RemoveContainer" containerID="2312e65cccff981a32fb124e131b2d0f9646908568d575f1181ef6833908e204" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.863701 4882 scope.go:117] "RemoveContainer" containerID="e797ff472429769a19e407c394af10ec378417f006ab301847f9d97baf78cf3a" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.888608 4882 scope.go:117] "RemoveContainer" containerID="7b2b5dfddc7012648f92825b22fc3dff9d26b581f27c153da193dcb7c40d999c" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.922600 4882 scope.go:117] "RemoveContainer" containerID="7ccab1093e8e71e85336832cf7c700db0388ad4acfad34fbf7a1afe604fdcb53" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.956730 4882 scope.go:117] "RemoveContainer" containerID="a698dea8c1459df07df2dd2e5c83bea57c6d7a823b74aae6cc9b205345f7a466" Oct 02 16:43:26 crc kubenswrapper[4882]: I1002 16:43:26.982048 4882 scope.go:117] "RemoveContainer" containerID="861f9f1e4489ef479933426da2698204038748e21a662991a1a2c346c552784f" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.009419 4882 scope.go:117] "RemoveContainer" containerID="d567389cb25bbabc1546297bf134bd90f58b2f6f44e328db815cbb62df15308f" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.060071 4882 scope.go:117] "RemoveContainer" containerID="36cab3f3790ff0ea86f4b5660fdfefeef5565c1d5b18cf3c3eaa32ba022e195f" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.079787 4882 scope.go:117] "RemoveContainer" containerID="a671a16f869006d4662756534716ca55280fdb96d3705b7a0f97ecb4b4a9fd45" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.107044 4882 scope.go:117] "RemoveContainer" containerID="e87d46b8c4b9c4e1d029bb60ac23e02cb1025402fb1c0203620330877f0c8b8a" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.132796 4882 scope.go:117] "RemoveContainer" containerID="fca6ccaa26b7ac77a6a8aeaaa1443eac120b8470f0a5167ae0835b65ccac8088" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.149401 4882 scope.go:117] "RemoveContainer" containerID="1583bc602eccf3526a8f4e6c9a0718e5060e03369a08c1eca224a21e74cae345" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.165709 4882 scope.go:117] "RemoveContainer" containerID="53d3a2dd59d9fce5a2ae1df8874af4314a5c1206051323fd6c315decab3bb56c" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.181369 4882 scope.go:117] "RemoveContainer" containerID="ab30ebb2f1646bffd938b73b39cd780a7082a80b77005b3a0bee01532eaec10e" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.199078 4882 scope.go:117] "RemoveContainer" 
containerID="bb5a7c2330538131246f478652e9a6a981806ba2ad6fa95077b86414166cb896" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.218398 4882 scope.go:117] "RemoveContainer" containerID="49daefd74c00c9ed3bfcfe52dc5591e1096340894023627744e4282783dddf06" Oct 02 16:43:27 crc kubenswrapper[4882]: I1002 16:43:27.254395 4882 scope.go:117] "RemoveContainer" containerID="c7c2d8087bd15225cd977111ec94e1a88249d610e4918662bc600166c40c96a6" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.390246 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.390751 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.390798 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.391415 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.391477 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" gracePeriod=600 Oct 02 16:43:39 crc kubenswrapper[4882]: E1002 16:43:39.523996 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.603092 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" exitCode=0 Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.603142 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599"} Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.603182 4882 scope.go:117] "RemoveContainer" containerID="4817b7e232fd2cfe282905e3863c3f81d1a0be19ec05b6f8eef5289d492e445b" Oct 02 16:43:39 crc kubenswrapper[4882]: I1002 16:43:39.603759 4882 
scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:43:39 crc kubenswrapper[4882]: E1002 16:43:39.604035 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:43:52 crc kubenswrapper[4882]: I1002 16:43:52.771641 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:43:52 crc kubenswrapper[4882]: E1002 16:43:52.774630 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:44:04 crc kubenswrapper[4882]: I1002 16:44:04.761001 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:44:04 crc kubenswrapper[4882]: E1002 16:44:04.762747 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:44:19 crc kubenswrapper[4882]: I1002 16:44:19.760552 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:44:19 crc kubenswrapper[4882]: E1002 16:44:19.761544 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:44:27 crc kubenswrapper[4882]: I1002 16:44:27.499378 4882 scope.go:117] "RemoveContainer" containerID="929fdba2157f4a815e2411de0bf84cb77c8d2439b65e441725cfba13acde8d5c" Oct 02 16:44:27 crc kubenswrapper[4882]: I1002 16:44:27.530804 4882 scope.go:117] "RemoveContainer" containerID="cc8da07ebf41240f0bc3ca44a1ba3b9cdde1ddc0293e1d6702b909eedaa87c97" Oct 02 16:44:27 crc kubenswrapper[4882]: I1002 16:44:27.578001 4882 scope.go:117] "RemoveContainer" containerID="ba198663776df5bcfbbe3aad9beb1270b80995557ab603838a2e4d86db558ad2" Oct 02 16:44:27 crc kubenswrapper[4882]: I1002 16:44:27.631870 4882 scope.go:117] "RemoveContainer" containerID="0a8b48d27f4da685ade95ade8fb027b77858817f1978dd7e0a894eef8c0afdff" Oct 02 16:44:34 crc kubenswrapper[4882]: I1002 16:44:34.760884 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:44:34 crc 
kubenswrapper[4882]: E1002 16:44:34.761962 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:44:46 crc kubenswrapper[4882]: I1002 16:44:46.760846 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:44:46 crc kubenswrapper[4882]: E1002 16:44:46.761718 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:44:57 crc kubenswrapper[4882]: I1002 16:44:57.760453 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:44:57 crc kubenswrapper[4882]: E1002 16:44:57.761320 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.145499 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx"] Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149140 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="extract-utilities" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149190 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="extract-utilities" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149241 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149250 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149261 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-api" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149270 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-api" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149279 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="rsync" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149286 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" 
containerName="rsync" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149298 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149329 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149346 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149353 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149368 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="extract-content" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149375 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="extract-content" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149386 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149421 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149437 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149444 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149457 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149464 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149473 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149494 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149505 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-expirer" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149513 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-expirer" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149524 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="extract-content" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149532 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" 
containerName="extract-content" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149553 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149560 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149571 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149579 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149587 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="swift-recon-cron" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149594 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="swift-recon-cron" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149603 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-httpd" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149611 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-httpd" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149620 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149627 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149640 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7cef74-9b55-4213-a934-7c1d2c058aab" containerName="mariadb-account-delete" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149648 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7cef74-9b55-4213-a934-7c1d2c058aab" containerName="mariadb-account-delete" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149661 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server-init" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149669 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server-init" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149681 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149689 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149705 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="extract-utilities" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149712 4882 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="extract-utilities" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149725 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149733 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149743 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149751 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149762 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149769 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149783 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149790 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: E1002 16:45:00.149800 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-reaper" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149809 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-reaper" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.149990 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150007 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150020 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-expirer" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150036 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150050 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1acdb9-23dd-43cd-b568-0e6a04f0db71" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150063 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="swift-recon-cron" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150073 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 
16:45:00.150084 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-httpd" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150094 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150104 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-updater" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150119 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="container-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150131 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="e835f425-aa72-4aa1-aadf-b190ea334fe2" containerName="registry-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150142 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150154 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-reaper" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150163 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="object-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150171 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovsdb-server" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150182 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="rsync" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150196 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7cef74-9b55-4213-a934-7c1d2c058aab" containerName="mariadb-account-delete" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150209 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="21667760-8ee1-456b-af11-a501cdf77822" containerName="ovs-vswitchd" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150241 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd93bba-dd83-4256-952a-d60fd3cefef4" containerName="neutron-api" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150252 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-auditor" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.150260 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd26acb-1d48-48f3-b39d-b274bdcd3cce" containerName="account-replicator" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.151172 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.153845 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx"] Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.156729 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.156768 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.340352 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8jn\" (UniqueName: \"kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.340466 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.340508 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.442083 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.442234 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8jn\" (UniqueName: \"kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.442311 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.444073 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume\") pod 
\"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.457124 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.462515 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8jn\" (UniqueName: \"kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn\") pod \"collect-profiles-29323725-w8zfx\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.479771 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:00 crc kubenswrapper[4882]: I1002 16:45:00.908487 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx"] Oct 02 16:45:01 crc kubenswrapper[4882]: I1002 16:45:01.408692 4882 generic.go:334] "Generic (PLEG): container finished" podID="3a8728be-bc58-48f1-9086-7a29c91b6a36" containerID="96b79dd775755f49045995a474d1e6bc90a9c3bf738a1909da1083833e260125" exitCode=0 Oct 02 16:45:01 crc kubenswrapper[4882]: I1002 16:45:01.408770 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" event={"ID":"3a8728be-bc58-48f1-9086-7a29c91b6a36","Type":"ContainerDied","Data":"96b79dd775755f49045995a474d1e6bc90a9c3bf738a1909da1083833e260125"} Oct 02 16:45:01 crc kubenswrapper[4882]: I1002 16:45:01.408818 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" event={"ID":"3a8728be-bc58-48f1-9086-7a29c91b6a36","Type":"ContainerStarted","Data":"90644c604d67cadd7383d224ea87e0a0180b3fcb2488ab716fbf64daf8e2e28d"} Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.703782 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.879694 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume\") pod \"3a8728be-bc58-48f1-9086-7a29c91b6a36\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.879817 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume\") pod \"3a8728be-bc58-48f1-9086-7a29c91b6a36\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.879905 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8jn\" (UniqueName: \"kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn\") pod \"3a8728be-bc58-48f1-9086-7a29c91b6a36\" (UID: \"3a8728be-bc58-48f1-9086-7a29c91b6a36\") " Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.880592 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a8728be-bc58-48f1-9086-7a29c91b6a36" (UID: "3a8728be-bc58-48f1-9086-7a29c91b6a36"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.885661 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn" (OuterVolumeSpecName: "kube-api-access-hv8jn") pod "3a8728be-bc58-48f1-9086-7a29c91b6a36" (UID: "3a8728be-bc58-48f1-9086-7a29c91b6a36"). InnerVolumeSpecName "kube-api-access-hv8jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.885708 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a8728be-bc58-48f1-9086-7a29c91b6a36" (UID: "3a8728be-bc58-48f1-9086-7a29c91b6a36"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.982008 4882 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a8728be-bc58-48f1-9086-7a29c91b6a36-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.982244 4882 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a8728be-bc58-48f1-9086-7a29c91b6a36-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 16:45:02 crc kubenswrapper[4882]: I1002 16:45:02.982332 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv8jn\" (UniqueName: \"kubernetes.io/projected/3a8728be-bc58-48f1-9086-7a29c91b6a36-kube-api-access-hv8jn\") on node \"crc\" DevicePath \"\"" Oct 02 16:45:03 crc kubenswrapper[4882]: I1002 16:45:03.427562 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" event={"ID":"3a8728be-bc58-48f1-9086-7a29c91b6a36","Type":"ContainerDied","Data":"90644c604d67cadd7383d224ea87e0a0180b3fcb2488ab716fbf64daf8e2e28d"} Oct 02 16:45:03 crc kubenswrapper[4882]: I1002 16:45:03.427859 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90644c604d67cadd7383d224ea87e0a0180b3fcb2488ab716fbf64daf8e2e28d" Oct 02 16:45:03 crc kubenswrapper[4882]: I1002 16:45:03.427607 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323725-w8zfx" Oct 02 16:45:09 crc kubenswrapper[4882]: I1002 16:45:09.760273 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:45:09 crc kubenswrapper[4882]: E1002 16:45:09.761080 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:45:20 crc kubenswrapper[4882]: I1002 16:45:20.760656 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:45:20 crc kubenswrapper[4882]: E1002 16:45:20.761445 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.729038 4882 scope.go:117] "RemoveContainer" containerID="35f5965b2379272baa64a6b7b2f7efd500aef2a6fe7e1de3b2042137525d5921" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.768637 4882 scope.go:117] "RemoveContainer" containerID="45bb1b9d3f14a231f70b2267ff101ad41616e2eb0158bfb01cf9db8c6d56d8ca" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.800585 4882 scope.go:117] "RemoveContainer" containerID="12b272e78e66a0a2addd30c1a027807a59f148b85e71c5a2b8d9836fe9b8d843" Oct 02 16:45:27 crc 
kubenswrapper[4882]: I1002 16:45:27.819930 4882 scope.go:117] "RemoveContainer" containerID="d5a7ca97a26025df40fa9bac55807a7e950fe94bb5ce25a333f4e5a7efd30156" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.835588 4882 scope.go:117] "RemoveContainer" containerID="e79f0308f09fd876708535b015dec7df2adfc0310031d690c5a22486f2f96d11" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.852929 4882 scope.go:117] "RemoveContainer" containerID="42bd27be6298749bedf886c1fdcb856b9c7bdde2ff094fd0bbcab807ca38fc43" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.878170 4882 scope.go:117] "RemoveContainer" containerID="93e77877e930a932f31f96d8c2b7d4e11b5ab6f70e5fa57cb3caad68f215b647" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.895734 4882 scope.go:117] "RemoveContainer" containerID="e93df0f0131f30a72dba182ec0f2f33b1dc2c6158f2a976f2348f6a3c71cbaaf" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.913301 4882 scope.go:117] "RemoveContainer" containerID="81bedb9aeb702667b8a6d51fc9ff288fc91ae9eafee405c8744188c4fd827eed" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.927052 4882 scope.go:117] "RemoveContainer" containerID="f8ee3cae1e7ab82fde39e0ad973e86431ad255814bd39da4664b826387400de6" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.945994 4882 scope.go:117] "RemoveContainer" containerID="78a144deaf1a06d4548c269702e27808700d9b0044a7206f272d417145ca6e45" Oct 02 16:45:27 crc kubenswrapper[4882]: I1002 16:45:27.984563 4882 scope.go:117] "RemoveContainer" containerID="2581f184636b48853f0d00e8ec8da4e2bfda90cb24a614d3b7f1a2a83bb228d5" Oct 02 16:45:28 crc kubenswrapper[4882]: I1002 16:45:28.002710 4882 scope.go:117] "RemoveContainer" containerID="2d5462c744d32c02349363789c3e1d2d942136eca10c547d25ef94b8fbf2a4c0" Oct 02 16:45:28 crc kubenswrapper[4882]: I1002 16:45:28.021369 4882 scope.go:117] "RemoveContainer" containerID="60dd8ce85cbf8ece34f51a408ff617b2287ea1992d2c01d977fc1479f2c461eb" Oct 02 16:45:28 crc kubenswrapper[4882]: I1002 16:45:28.036420 4882 scope.go:117] "RemoveContainer" containerID="3445a7bd3b9d7e8f1b81bd2cb1baba59e81f2eb716e5ad4d3e54e83b4ed8a5e2" Oct 02 16:45:31 crc kubenswrapper[4882]: I1002 16:45:31.791330 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:45:31 crc kubenswrapper[4882]: E1002 16:45:31.793727 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:45:46 crc kubenswrapper[4882]: I1002 16:45:46.760440 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:45:46 crc kubenswrapper[4882]: E1002 16:45:46.761124 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:46:01 crc kubenswrapper[4882]: I1002 16:46:01.759987 4882 scope.go:117] 
"RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:46:01 crc kubenswrapper[4882]: E1002 16:46:01.760863 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:46:13 crc kubenswrapper[4882]: I1002 16:46:13.760398 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:46:13 crc kubenswrapper[4882]: E1002 16:46:13.761355 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.179128 4882 scope.go:117] "RemoveContainer" containerID="7d2d4ce2c2e6521a23764d7d8191e9ba960363470732242e3f515311010cfa5e" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.208888 4882 scope.go:117] "RemoveContainer" containerID="f281d92386df9b537dc64cd1c262d2a95b35ba4313d97c921f09342506f64857" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.237158 4882 scope.go:117] "RemoveContainer" containerID="5d74ea1886bb363a04bd54bc89d361229a325d82664628a7f70cd69449c9b60a" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.292689 4882 scope.go:117] "RemoveContainer" containerID="cd9314b6deb15a5ba62c52e608381c59d8b1a515fc6f2ae8581e6de0e466c344" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.335035 4882 scope.go:117] "RemoveContainer" containerID="e224cf765658a9d38193f466e570d0a94987194efa3f1b60bb77d3ab823c7874" Oct 02 16:46:28 crc kubenswrapper[4882]: I1002 16:46:28.760757 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:46:28 crc kubenswrapper[4882]: E1002 16:46:28.761030 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:46:39 crc kubenswrapper[4882]: I1002 16:46:39.761241 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:46:39 crc kubenswrapper[4882]: E1002 16:46:39.762075 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:46:52 crc kubenswrapper[4882]: I1002 
16:46:52.765524 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:46:52 crc kubenswrapper[4882]: E1002 16:46:52.766264 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:47:03 crc kubenswrapper[4882]: I1002 16:47:03.759779 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:47:03 crc kubenswrapper[4882]: E1002 16:47:03.760579 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:47:15 crc kubenswrapper[4882]: I1002 16:47:15.760672 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:47:15 crc kubenswrapper[4882]: E1002 16:47:15.761814 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.451655 4882 scope.go:117] "RemoveContainer" containerID="e2ff269e176bc20d80aa95fcdf36998283cbc95b8397ab30cf540ffd4bf2d588" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.485057 4882 scope.go:117] "RemoveContainer" containerID="513b374180132a0f297498712d1e51739fc2ad0c26615a542f982ac395df9d3e" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.513052 4882 scope.go:117] "RemoveContainer" containerID="1f23372a0ffce94576fa7b8b12f78b5723d92783fb82f3c3a11c09be97afc71e" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.536326 4882 scope.go:117] "RemoveContainer" containerID="9a942b8c17e8463a8e6f76d1513de147a7fbccbc3c6bce75a888d236140df363" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.567149 4882 scope.go:117] "RemoveContainer" containerID="bcb72f41583b2dc39fc26211b1de727dd1fbe88709748d3f9c2182967e02454f" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.617731 4882 scope.go:117] "RemoveContainer" containerID="2d2f7b7472a6b146044b414ca3cbab36ecb87a0c178a2b350803295c0549c80d" Oct 02 16:47:28 crc kubenswrapper[4882]: I1002 16:47:28.637741 4882 scope.go:117] "RemoveContainer" containerID="94742d7620ee6e9af50f0bdad8f628d680e7d044141c67f87aac9ef2bfa64a8b" Oct 02 16:47:31 crc kubenswrapper[4882]: I1002 16:47:31.760132 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:47:31 crc kubenswrapper[4882]: E1002 16:47:31.760824 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:47:45 crc kubenswrapper[4882]: I1002 16:47:45.760278 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:47:45 crc kubenswrapper[4882]: E1002 16:47:45.761121 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:48:00 crc kubenswrapper[4882]: I1002 16:48:00.761955 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:48:00 crc kubenswrapper[4882]: E1002 16:48:00.764707 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:48:14 crc kubenswrapper[4882]: I1002 16:48:14.760477 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:48:14 crc kubenswrapper[4882]: E1002 16:48:14.761252 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:48:28 crc kubenswrapper[4882]: I1002 16:48:28.712123 4882 scope.go:117] "RemoveContainer" containerID="1f6b1ba588353fa4b48eb5a0e6cd51674da94c25662edd41a4ae4d86d2142347" Oct 02 16:48:28 crc kubenswrapper[4882]: I1002 16:48:28.738644 4882 scope.go:117] "RemoveContainer" containerID="0dbcbc8ee3a0c5fdc9984a4b48b1e902f6d70fc302ca3885063927617a9dc73e" Oct 02 16:48:28 crc kubenswrapper[4882]: I1002 16:48:28.760900 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:48:28 crc kubenswrapper[4882]: E1002 16:48:28.762134 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:48:40 crc kubenswrapper[4882]: I1002 16:48:40.760797 4882 scope.go:117] "RemoveContainer" 
containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:48:41 crc kubenswrapper[4882]: I1002 16:48:41.322581 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17"} Oct 02 16:51:09 crc kubenswrapper[4882]: I1002 16:51:09.390996 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:51:09 crc kubenswrapper[4882]: I1002 16:51:09.391731 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:51:39 crc kubenswrapper[4882]: I1002 16:51:39.390602 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:51:39 crc kubenswrapper[4882]: I1002 16:51:39.391069 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.398517 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:51:51 crc kubenswrapper[4882]: E1002 16:51:51.399325 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8728be-bc58-48f1-9086-7a29c91b6a36" containerName="collect-profiles" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.399337 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8728be-bc58-48f1-9086-7a29c91b6a36" containerName="collect-profiles" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.399472 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a8728be-bc58-48f1-9086-7a29c91b6a36" containerName="collect-profiles" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.400463 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.409337 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.499697 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.499765 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.500088 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhkx\" (UniqueName: \"kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.601336 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.601405 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.601450 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhkx\" (UniqueName: \"kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.601899 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.602198 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.640203 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5hhkx\" (UniqueName: \"kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx\") pod \"redhat-operators-drzvf\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.723306 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:51:51 crc kubenswrapper[4882]: I1002 16:51:51.967531 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:51:52 crc kubenswrapper[4882]: I1002 16:51:52.888464 4882 generic.go:334] "Generic (PLEG): container finished" podID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerID="4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761" exitCode=0 Oct 02 16:51:52 crc kubenswrapper[4882]: I1002 16:51:52.888537 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerDied","Data":"4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761"} Oct 02 16:51:52 crc kubenswrapper[4882]: I1002 16:51:52.888930 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerStarted","Data":"259dbf375c349ccb0f4dfa06f901f17c8159904e5b9d137a5dfe92f722116cf2"} Oct 02 16:51:52 crc kubenswrapper[4882]: I1002 16:51:52.890533 4882 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 16:51:53 crc kubenswrapper[4882]: I1002 16:51:53.899480 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerStarted","Data":"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d"} Oct 02 16:51:54 crc kubenswrapper[4882]: I1002 16:51:54.917764 4882 generic.go:334] "Generic (PLEG): container finished" podID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerID="54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d" exitCode=0 Oct 02 16:51:54 crc kubenswrapper[4882]: I1002 16:51:54.917819 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerDied","Data":"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d"} Oct 02 16:51:55 crc kubenswrapper[4882]: I1002 16:51:55.926968 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerStarted","Data":"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325"} Oct 02 16:51:55 crc kubenswrapper[4882]: I1002 16:51:55.950093 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-drzvf" podStartSLOduration=2.308092368 podStartE2EDuration="4.950069097s" podCreationTimestamp="2025-10-02 16:51:51 +0000 UTC" firstStartedPulling="2025-10-02 16:51:52.889955039 +0000 UTC m=+2071.639184606" lastFinishedPulling="2025-10-02 16:51:55.531931808 +0000 UTC m=+2074.281161335" observedRunningTime="2025-10-02 16:51:55.9433907 +0000 UTC m=+2074.692620247" watchObservedRunningTime="2025-10-02 16:51:55.950069097 +0000 UTC m=+2074.699298634" Oct 02 16:51:59 crc 
kubenswrapper[4882]: I1002 16:51:59.875738 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:51:59 crc kubenswrapper[4882]: I1002 16:51:59.878089 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:51:59 crc kubenswrapper[4882]: I1002 16:51:59.898782 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.024684 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.024747 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.024768 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drl75\" (UniqueName: \"kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.126094 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.126167 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.126192 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drl75\" (UniqueName: \"kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.126739 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.126758 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.150392 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drl75\" (UniqueName: \"kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75\") pod \"certified-operators-fs2n5\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.210431 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.688983 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.969399 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerID="578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a" exitCode=0 Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.969476 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerDied","Data":"578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a"} Oct 02 16:52:00 crc kubenswrapper[4882]: I1002 16:52:00.969700 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerStarted","Data":"ab6b2148d17a58a0fcfc8979c272cb6e9f6e1ad8935050ec9aa38dc81aaeb9f5"} Oct 02 16:52:01 crc kubenswrapper[4882]: I1002 16:52:01.723598 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:01 crc kubenswrapper[4882]: I1002 16:52:01.723673 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:01 crc kubenswrapper[4882]: I1002 16:52:01.792883 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:01 crc kubenswrapper[4882]: I1002 16:52:01.982737 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerID="e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc" exitCode=0 Oct 02 16:52:01 crc kubenswrapper[4882]: I1002 16:52:01.982802 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerDied","Data":"e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc"} Oct 02 16:52:02 crc kubenswrapper[4882]: I1002 16:52:02.043238 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:02 crc kubenswrapper[4882]: I1002 16:52:02.994377 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerStarted","Data":"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230"} Oct 02 16:52:03 crc 
kubenswrapper[4882]: I1002 16:52:03.020035 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fs2n5" podStartSLOduration=2.55248574 podStartE2EDuration="4.020018146s" podCreationTimestamp="2025-10-02 16:51:59 +0000 UTC" firstStartedPulling="2025-10-02 16:52:00.972581341 +0000 UTC m=+2079.721810868" lastFinishedPulling="2025-10-02 16:52:02.440113757 +0000 UTC m=+2081.189343274" observedRunningTime="2025-10-02 16:52:03.016630692 +0000 UTC m=+2081.765860219" watchObservedRunningTime="2025-10-02 16:52:03.020018146 +0000 UTC m=+2081.769247673" Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.051965 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.052466 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-drzvf" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="registry-server" containerID="cri-o://01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325" gracePeriod=2 Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.469746 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.595998 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hhkx\" (UniqueName: \"kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx\") pod \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.596093 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities\") pod \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.596205 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content\") pod \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\" (UID: \"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7\") " Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.597742 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities" (OuterVolumeSpecName: "utilities") pod "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" (UID: "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.604355 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx" (OuterVolumeSpecName: "kube-api-access-5hhkx") pod "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" (UID: "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7"). InnerVolumeSpecName "kube-api-access-5hhkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.697681 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hhkx\" (UniqueName: \"kubernetes.io/projected/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-kube-api-access-5hhkx\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:04 crc kubenswrapper[4882]: I1002 16:52:04.697723 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.017458 4882 generic.go:334] "Generic (PLEG): container finished" podID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerID="01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325" exitCode=0 Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.017562 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drzvf" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.017584 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerDied","Data":"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325"} Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.018119 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drzvf" event={"ID":"9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7","Type":"ContainerDied","Data":"259dbf375c349ccb0f4dfa06f901f17c8159904e5b9d137a5dfe92f722116cf2"} Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.018174 4882 scope.go:117] "RemoveContainer" containerID="01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.051311 4882 scope.go:117] "RemoveContainer" containerID="54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.068571 4882 scope.go:117] "RemoveContainer" containerID="4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.095253 4882 scope.go:117] "RemoveContainer" containerID="01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325" Oct 02 16:52:05 crc kubenswrapper[4882]: E1002 16:52:05.095985 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325\": container with ID starting with 01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325 not found: ID does not exist" containerID="01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.096035 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325"} err="failed to get container status \"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325\": rpc error: code = NotFound desc = could not find container \"01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325\": container with ID starting with 01fe442b6ce4b1c241ad73d0f8c2f8b3aaddd3eb227833421c09b5e3b7930325 not found: ID does not exist" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.096070 4882 scope.go:117] 
"RemoveContainer" containerID="54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d" Oct 02 16:52:05 crc kubenswrapper[4882]: E1002 16:52:05.096679 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d\": container with ID starting with 54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d not found: ID does not exist" containerID="54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.096734 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d"} err="failed to get container status \"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d\": rpc error: code = NotFound desc = could not find container \"54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d\": container with ID starting with 54724fb181447278276cd6f7bded7719240bbe8caa1b6d64db903df7ae71278d not found: ID does not exist" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.096767 4882 scope.go:117] "RemoveContainer" containerID="4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761" Oct 02 16:52:05 crc kubenswrapper[4882]: E1002 16:52:05.097285 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761\": container with ID starting with 4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761 not found: ID does not exist" containerID="4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.097327 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761"} err="failed to get container status \"4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761\": rpc error: code = NotFound desc = could not find container \"4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761\": container with ID starting with 4d524ad0e3d0f00adfcc79dc8b5c11962a5a4e89ddcff726b93a82ae1e571761 not found: ID does not exist" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.357275 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" (UID: "9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.409000 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.658097 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:52:05 crc kubenswrapper[4882]: I1002 16:52:05.663391 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-drzvf"] Oct 02 16:52:06 crc kubenswrapper[4882]: I1002 16:52:06.772194 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" path="/var/lib/kubelet/pods/9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7/volumes" Oct 02 16:52:09 crc kubenswrapper[4882]: I1002 16:52:09.390648 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:52:09 crc kubenswrapper[4882]: I1002 16:52:09.391080 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:52:09 crc kubenswrapper[4882]: I1002 16:52:09.391158 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:52:09 crc kubenswrapper[4882]: I1002 16:52:09.392116 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:52:09 crc kubenswrapper[4882]: I1002 16:52:09.392264 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17" gracePeriod=600 Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.063020 4882 generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17" exitCode=0 Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.063382 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17"} Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.063529 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" 
event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3"} Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.063560 4882 scope.go:117] "RemoveContainer" containerID="82767a95e2f16f0a619e7007b334d99b02332a09ae5dd25e6c8c7b0b0d4a6599" Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.210924 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.211049 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:10 crc kubenswrapper[4882]: I1002 16:52:10.286502 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:11 crc kubenswrapper[4882]: I1002 16:52:11.124521 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:11 crc kubenswrapper[4882]: I1002 16:52:11.172780 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.096125 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fs2n5" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="registry-server" containerID="cri-o://4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230" gracePeriod=2 Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.575766 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.638574 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drl75\" (UniqueName: \"kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75\") pod \"9d8d69da-cbdd-49e4-9622-79485ccd337b\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.638684 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities\") pod \"9d8d69da-cbdd-49e4-9622-79485ccd337b\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.638811 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content\") pod \"9d8d69da-cbdd-49e4-9622-79485ccd337b\" (UID: \"9d8d69da-cbdd-49e4-9622-79485ccd337b\") " Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.640371 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities" (OuterVolumeSpecName: "utilities") pod "9d8d69da-cbdd-49e4-9622-79485ccd337b" (UID: "9d8d69da-cbdd-49e4-9622-79485ccd337b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.646528 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75" (OuterVolumeSpecName: "kube-api-access-drl75") pod "9d8d69da-cbdd-49e4-9622-79485ccd337b" (UID: "9d8d69da-cbdd-49e4-9622-79485ccd337b"). InnerVolumeSpecName "kube-api-access-drl75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.693762 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d8d69da-cbdd-49e4-9622-79485ccd337b" (UID: "9d8d69da-cbdd-49e4-9622-79485ccd337b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.740983 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drl75\" (UniqueName: \"kubernetes.io/projected/9d8d69da-cbdd-49e4-9622-79485ccd337b-kube-api-access-drl75\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.741017 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.741027 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d8d69da-cbdd-49e4-9622-79485ccd337b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942341 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:13 crc kubenswrapper[4882]: E1002 16:52:13.942734 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="extract-content" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942758 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="extract-content" Oct 02 16:52:13 crc kubenswrapper[4882]: E1002 16:52:13.942770 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942778 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: E1002 16:52:13.942790 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942799 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: E1002 16:52:13.942814 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="extract-utilities" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942822 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="extract-utilities" Oct 02 16:52:13 crc 
kubenswrapper[4882]: E1002 16:52:13.942832 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="extract-utilities" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942839 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="extract-utilities" Oct 02 16:52:13 crc kubenswrapper[4882]: E1002 16:52:13.942861 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="extract-content" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.942868 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="extract-content" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.943010 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e274ff3-4cd7-4b5c-b5aa-56dd7ce0dcd7" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.943023 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerName="registry-server" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.944101 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:13 crc kubenswrapper[4882]: I1002 16:52:13.962099 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.044631 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmmh\" (UniqueName: \"kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.044744 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.044961 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.105331 4882 generic.go:334] "Generic (PLEG): container finished" podID="9d8d69da-cbdd-49e4-9622-79485ccd337b" containerID="4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230" exitCode=0 Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.105488 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerDied","Data":"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230"} Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.105573 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fs2n5" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.106426 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fs2n5" event={"ID":"9d8d69da-cbdd-49e4-9622-79485ccd337b","Type":"ContainerDied","Data":"ab6b2148d17a58a0fcfc8979c272cb6e9f6e1ad8935050ec9aa38dc81aaeb9f5"} Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.106476 4882 scope.go:117] "RemoveContainer" containerID="4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.133515 4882 scope.go:117] "RemoveContainer" containerID="e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.136492 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.142892 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fs2n5"] Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.146763 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmmh\" (UniqueName: \"kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.146821 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.146894 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.147386 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.147421 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.166161 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmmh\" (UniqueName: \"kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh\") pod \"redhat-marketplace-t8w82\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.181103 4882 scope.go:117] "RemoveContainer" 
containerID="578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.205439 4882 scope.go:117] "RemoveContainer" containerID="4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230" Oct 02 16:52:14 crc kubenswrapper[4882]: E1002 16:52:14.206114 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230\": container with ID starting with 4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230 not found: ID does not exist" containerID="4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.206194 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230"} err="failed to get container status \"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230\": rpc error: code = NotFound desc = could not find container \"4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230\": container with ID starting with 4a184215fce6df072235e628b66c4808b74e5db219c4d70abdfacb70eb6b6230 not found: ID does not exist" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.206262 4882 scope.go:117] "RemoveContainer" containerID="e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc" Oct 02 16:52:14 crc kubenswrapper[4882]: E1002 16:52:14.206779 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc\": container with ID starting with e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc not found: ID does not exist" containerID="e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.206836 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc"} err="failed to get container status \"e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc\": rpc error: code = NotFound desc = could not find container \"e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc\": container with ID starting with e89260a5624a20160c132a4a16cdef49086ffa779e3a45ad19fd01cf2f589adc not found: ID does not exist" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.206871 4882 scope.go:117] "RemoveContainer" containerID="578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a" Oct 02 16:52:14 crc kubenswrapper[4882]: E1002 16:52:14.207254 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a\": container with ID starting with 578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a not found: ID does not exist" containerID="578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.207304 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a"} err="failed to get container status \"578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a\": rpc error: code = 
NotFound desc = could not find container \"578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a\": container with ID starting with 578f928f0bddfd44db2f34b00cc0e7b05bf8eeb4383bd08c14a1a9ba4fe4db4a not found: ID does not exist" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.266845 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.712461 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:14 crc kubenswrapper[4882]: I1002 16:52:14.777721 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8d69da-cbdd-49e4-9622-79485ccd337b" path="/var/lib/kubelet/pods/9d8d69da-cbdd-49e4-9622-79485ccd337b/volumes" Oct 02 16:52:15 crc kubenswrapper[4882]: I1002 16:52:15.119009 4882 generic.go:334] "Generic (PLEG): container finished" podID="9126f519-1686-415c-a2e2-25723a75a50b" containerID="74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9" exitCode=0 Oct 02 16:52:15 crc kubenswrapper[4882]: I1002 16:52:15.119075 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerDied","Data":"74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9"} Oct 02 16:52:15 crc kubenswrapper[4882]: I1002 16:52:15.119134 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerStarted","Data":"bee790463a743b3c5751d3f56cc490f95848863e9ffd53784ba80d8e32ed2c06"} Oct 02 16:52:16 crc kubenswrapper[4882]: I1002 16:52:16.126476 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerStarted","Data":"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854"} Oct 02 16:52:17 crc kubenswrapper[4882]: I1002 16:52:17.137805 4882 generic.go:334] "Generic (PLEG): container finished" podID="9126f519-1686-415c-a2e2-25723a75a50b" containerID="ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854" exitCode=0 Oct 02 16:52:17 crc kubenswrapper[4882]: I1002 16:52:17.137877 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerDied","Data":"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854"} Oct 02 16:52:18 crc kubenswrapper[4882]: I1002 16:52:18.146320 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerStarted","Data":"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28"} Oct 02 16:52:18 crc kubenswrapper[4882]: I1002 16:52:18.174968 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t8w82" podStartSLOduration=2.627811221 podStartE2EDuration="5.174947297s" podCreationTimestamp="2025-10-02 16:52:13 +0000 UTC" firstStartedPulling="2025-10-02 16:52:15.12048059 +0000 UTC m=+2093.869710147" lastFinishedPulling="2025-10-02 16:52:17.667616656 +0000 UTC m=+2096.416846223" observedRunningTime="2025-10-02 16:52:18.168606869 +0000 UTC m=+2096.917836456" 
watchObservedRunningTime="2025-10-02 16:52:18.174947297 +0000 UTC m=+2096.924176844" Oct 02 16:52:24 crc kubenswrapper[4882]: I1002 16:52:24.268488 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:24 crc kubenswrapper[4882]: I1002 16:52:24.269189 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:24 crc kubenswrapper[4882]: I1002 16:52:24.323071 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.079425 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.081127 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.102463 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.109576 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.109674 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ndq\" (UniqueName: \"kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.109908 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.216438 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.216555 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ndq\" (UniqueName: \"kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.216595 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities\") pod \"community-operators-ht6jn\" (UID: 
\"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.216981 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.217004 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.244553 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ndq\" (UniqueName: \"kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq\") pod \"community-operators-ht6jn\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.253521 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.410289 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:25 crc kubenswrapper[4882]: I1002 16:52:25.749434 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:25 crc kubenswrapper[4882]: W1002 16:52:25.754268 4882 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fbd960_b1f0_4bee_8d96_0f448472e4ce.slice/crio-9618ae2426fa4b064f2088cdeee6d25a812f4dc518073c8e6b2b51fd541325a0 WatchSource:0}: Error finding container 9618ae2426fa4b064f2088cdeee6d25a812f4dc518073c8e6b2b51fd541325a0: Status 404 returned error can't find the container with id 9618ae2426fa4b064f2088cdeee6d25a812f4dc518073c8e6b2b51fd541325a0 Oct 02 16:52:26 crc kubenswrapper[4882]: I1002 16:52:26.213191 4882 generic.go:334] "Generic (PLEG): container finished" podID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerID="96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c" exitCode=0 Oct 02 16:52:26 crc kubenswrapper[4882]: I1002 16:52:26.213250 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerDied","Data":"96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c"} Oct 02 16:52:26 crc kubenswrapper[4882]: I1002 16:52:26.213290 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerStarted","Data":"9618ae2426fa4b064f2088cdeee6d25a812f4dc518073c8e6b2b51fd541325a0"} Oct 02 16:52:27 crc kubenswrapper[4882]: I1002 16:52:27.224721 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" 
event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerStarted","Data":"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c"} Oct 02 16:52:27 crc kubenswrapper[4882]: I1002 16:52:27.555160 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:27 crc kubenswrapper[4882]: I1002 16:52:27.555399 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t8w82" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="registry-server" containerID="cri-o://502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28" gracePeriod=2 Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.005344 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.058755 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities\") pod \"9126f519-1686-415c-a2e2-25723a75a50b\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.058911 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pmmh\" (UniqueName: \"kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh\") pod \"9126f519-1686-415c-a2e2-25723a75a50b\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.058931 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content\") pod \"9126f519-1686-415c-a2e2-25723a75a50b\" (UID: \"9126f519-1686-415c-a2e2-25723a75a50b\") " Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.059858 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities" (OuterVolumeSpecName: "utilities") pod "9126f519-1686-415c-a2e2-25723a75a50b" (UID: "9126f519-1686-415c-a2e2-25723a75a50b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.065249 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh" (OuterVolumeSpecName: "kube-api-access-5pmmh") pod "9126f519-1686-415c-a2e2-25723a75a50b" (UID: "9126f519-1686-415c-a2e2-25723a75a50b"). InnerVolumeSpecName "kube-api-access-5pmmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.078527 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9126f519-1686-415c-a2e2-25723a75a50b" (UID: "9126f519-1686-415c-a2e2-25723a75a50b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.160860 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pmmh\" (UniqueName: \"kubernetes.io/projected/9126f519-1686-415c-a2e2-25723a75a50b-kube-api-access-5pmmh\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.160929 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.160947 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9126f519-1686-415c-a2e2-25723a75a50b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.235828 4882 generic.go:334] "Generic (PLEG): container finished" podID="9126f519-1686-415c-a2e2-25723a75a50b" containerID="502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28" exitCode=0 Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.235928 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerDied","Data":"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28"} Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.236006 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8w82" event={"ID":"9126f519-1686-415c-a2e2-25723a75a50b","Type":"ContainerDied","Data":"bee790463a743b3c5751d3f56cc490f95848863e9ffd53784ba80d8e32ed2c06"} Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.236029 4882 scope.go:117] "RemoveContainer" containerID="502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.236988 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8w82" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.238300 4882 generic.go:334] "Generic (PLEG): container finished" podID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerID="c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c" exitCode=0 Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.238338 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerDied","Data":"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c"} Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.259745 4882 scope.go:117] "RemoveContainer" containerID="ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.287774 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.298840 4882 scope.go:117] "RemoveContainer" containerID="74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.299826 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8w82"] Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.315741 4882 scope.go:117] "RemoveContainer" containerID="502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28" Oct 02 16:52:28 crc kubenswrapper[4882]: E1002 16:52:28.316263 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28\": container with ID starting with 502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28 not found: ID does not exist" containerID="502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.316319 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28"} err="failed to get container status \"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28\": rpc error: code = NotFound desc = could not find container \"502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28\": container with ID starting with 502154e2ee738489b821414156634847c80ded41a10ef87db59342c32833ee28 not found: ID does not exist" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.316354 4882 scope.go:117] "RemoveContainer" containerID="ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854" Oct 02 16:52:28 crc kubenswrapper[4882]: E1002 16:52:28.316662 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854\": container with ID starting with ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854 not found: ID does not exist" containerID="ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.316693 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854"} err="failed to get container status 
\"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854\": rpc error: code = NotFound desc = could not find container \"ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854\": container with ID starting with ea088bb4913624c6ce9ad54436deac0c5142dacdf55e9f08e7dd14cf3866f854 not found: ID does not exist" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.316715 4882 scope.go:117] "RemoveContainer" containerID="74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9" Oct 02 16:52:28 crc kubenswrapper[4882]: E1002 16:52:28.316958 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9\": container with ID starting with 74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9 not found: ID does not exist" containerID="74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.316987 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9"} err="failed to get container status \"74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9\": rpc error: code = NotFound desc = could not find container \"74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9\": container with ID starting with 74bcd5196eb4215aa847b9a5906082bd8b1ff9819c60a964f71d7bba5af6d1f9 not found: ID does not exist" Oct 02 16:52:28 crc kubenswrapper[4882]: I1002 16:52:28.770354 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9126f519-1686-415c-a2e2-25723a75a50b" path="/var/lib/kubelet/pods/9126f519-1686-415c-a2e2-25723a75a50b/volumes" Oct 02 16:52:29 crc kubenswrapper[4882]: I1002 16:52:29.250905 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerStarted","Data":"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7"} Oct 02 16:52:29 crc kubenswrapper[4882]: I1002 16:52:29.279710 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ht6jn" podStartSLOduration=1.847501708 podStartE2EDuration="4.27968871s" podCreationTimestamp="2025-10-02 16:52:25 +0000 UTC" firstStartedPulling="2025-10-02 16:52:26.214796623 +0000 UTC m=+2104.964026160" lastFinishedPulling="2025-10-02 16:52:28.646983595 +0000 UTC m=+2107.396213162" observedRunningTime="2025-10-02 16:52:29.276559282 +0000 UTC m=+2108.025788829" watchObservedRunningTime="2025-10-02 16:52:29.27968871 +0000 UTC m=+2108.028918257" Oct 02 16:52:35 crc kubenswrapper[4882]: I1002 16:52:35.410848 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:35 crc kubenswrapper[4882]: I1002 16:52:35.411477 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:35 crc kubenswrapper[4882]: I1002 16:52:35.499706 4882 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:36 crc kubenswrapper[4882]: I1002 16:52:36.375155 4882 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:36 crc 
kubenswrapper[4882]: I1002 16:52:36.436849 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.327207 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ht6jn" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="registry-server" containerID="cri-o://6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7" gracePeriod=2 Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.745462 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.814015 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities\") pod \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.814074 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4ndq\" (UniqueName: \"kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq\") pod \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.814131 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content\") pod \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\" (UID: \"22fbd960-b1f0-4bee-8d96-0f448472e4ce\") " Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.816139 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities" (OuterVolumeSpecName: "utilities") pod "22fbd960-b1f0-4bee-8d96-0f448472e4ce" (UID: "22fbd960-b1f0-4bee-8d96-0f448472e4ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.820722 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq" (OuterVolumeSpecName: "kube-api-access-p4ndq") pod "22fbd960-b1f0-4bee-8d96-0f448472e4ce" (UID: "22fbd960-b1f0-4bee-8d96-0f448472e4ce"). InnerVolumeSpecName "kube-api-access-p4ndq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.878665 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22fbd960-b1f0-4bee-8d96-0f448472e4ce" (UID: "22fbd960-b1f0-4bee-8d96-0f448472e4ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.916598 4882 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.916630 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4ndq\" (UniqueName: \"kubernetes.io/projected/22fbd960-b1f0-4bee-8d96-0f448472e4ce-kube-api-access-p4ndq\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:38 crc kubenswrapper[4882]: I1002 16:52:38.916639 4882 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22fbd960-b1f0-4bee-8d96-0f448472e4ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.338302 4882 generic.go:334] "Generic (PLEG): container finished" podID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerID="6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7" exitCode=0 Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.338380 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ht6jn" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.338383 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerDied","Data":"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7"} Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.338516 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ht6jn" event={"ID":"22fbd960-b1f0-4bee-8d96-0f448472e4ce","Type":"ContainerDied","Data":"9618ae2426fa4b064f2088cdeee6d25a812f4dc518073c8e6b2b51fd541325a0"} Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.338537 4882 scope.go:117] "RemoveContainer" containerID="6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.398183 4882 scope.go:117] "RemoveContainer" containerID="c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.400452 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.405513 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ht6jn"] Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.436706 4882 scope.go:117] "RemoveContainer" containerID="96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.462013 4882 scope.go:117] "RemoveContainer" containerID="6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7" Oct 02 16:52:39 crc kubenswrapper[4882]: E1002 16:52:39.462517 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7\": container with ID starting with 6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7 not found: ID does not exist" containerID="6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.462559 
4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7"} err="failed to get container status \"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7\": rpc error: code = NotFound desc = could not find container \"6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7\": container with ID starting with 6cf8719d6b1a87652a79fd992096f0843e0bde57a8a5bba378a8197d9d7bf9e7 not found: ID does not exist" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.462589 4882 scope.go:117] "RemoveContainer" containerID="c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c" Oct 02 16:52:39 crc kubenswrapper[4882]: E1002 16:52:39.462941 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c\": container with ID starting with c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c not found: ID does not exist" containerID="c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.462983 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c"} err="failed to get container status \"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c\": rpc error: code = NotFound desc = could not find container \"c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c\": container with ID starting with c4021ef0faade59afddda6c3f7c1d2a5f851a8ac8050731c9ac7bb9d9e26b16c not found: ID does not exist" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.463008 4882 scope.go:117] "RemoveContainer" containerID="96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c" Oct 02 16:52:39 crc kubenswrapper[4882]: E1002 16:52:39.463249 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c\": container with ID starting with 96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c not found: ID does not exist" containerID="96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c" Oct 02 16:52:39 crc kubenswrapper[4882]: I1002 16:52:39.463275 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c"} err="failed to get container status \"96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c\": rpc error: code = NotFound desc = could not find container \"96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c\": container with ID starting with 96b913a27d53fec44245d0483ac085c236df3931cf40dad834272fafcf0d4d5c not found: ID does not exist" Oct 02 16:52:40 crc kubenswrapper[4882]: I1002 16:52:40.771211 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" path="/var/lib/kubelet/pods/22fbd960-b1f0-4bee-8d96-0f448472e4ce/volumes" Oct 02 16:54:09 crc kubenswrapper[4882]: I1002 16:54:09.390330 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:54:09 crc kubenswrapper[4882]: I1002 16:54:09.390955 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:54:39 crc kubenswrapper[4882]: I1002 16:54:39.390586 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:54:39 crc kubenswrapper[4882]: I1002 16:54:39.391425 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.389953 4882 patch_prober.go:28] interesting pod/machine-config-daemon-jxblv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.390562 4882 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.390614 4882 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.391182 4882 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3"} pod="openshift-machine-config-operator/machine-config-daemon-jxblv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.391285 4882 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerName="machine-config-daemon" containerID="cri-o://39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" gracePeriod=600 Oct 02 16:55:09 crc kubenswrapper[4882]: E1002 16:55:09.514583 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.650814 4882 
generic.go:334] "Generic (PLEG): container finished" podID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" exitCode=0 Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.650889 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerDied","Data":"39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3"} Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.650926 4882 scope.go:117] "RemoveContainer" containerID="58c1b1c521b62f518e400e7167d0717c7838a933e2da2419084f725f5f609d17" Oct 02 16:55:09 crc kubenswrapper[4882]: I1002 16:55:09.651518 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:55:09 crc kubenswrapper[4882]: E1002 16:55:09.651820 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:55:21 crc kubenswrapper[4882]: I1002 16:55:21.760838 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:55:21 crc kubenswrapper[4882]: E1002 16:55:21.761618 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.991869 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k4942/must-gather-8j2d9"] Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993168 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="extract-content" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993191 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="extract-content" Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993290 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="extract-utilities" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993304 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="extract-utilities" Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993319 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993331 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993353 4882 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="extract-utilities" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993364 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="extract-utilities" Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993380 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993390 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: E1002 16:55:27.993410 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="extract-content" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993421 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="extract-content" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993650 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fbd960-b1f0-4bee-8d96-0f448472e4ce" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.993705 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="9126f519-1686-415c-a2e2-25723a75a50b" containerName="registry-server" Oct 02 16:55:27 crc kubenswrapper[4882]: I1002 16:55:27.995123 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.000417 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k4942"/"openshift-service-ca.crt" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.000996 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k4942"/"kube-root-ca.crt" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.004858 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k4942/must-gather-8j2d9"] Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.187031 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.187101 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7qvd\" (UniqueName: \"kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.288345 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.288409 4882 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d7qvd\" (UniqueName: \"kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.288889 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.306034 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7qvd\" (UniqueName: \"kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd\") pod \"must-gather-8j2d9\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.329341 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.519960 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k4942/must-gather-8j2d9"] Oct 02 16:55:28 crc kubenswrapper[4882]: I1002 16:55:28.840091 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4942/must-gather-8j2d9" event={"ID":"d56c4a94-c50c-452b-ae39-5bf98ad2cce7","Type":"ContainerStarted","Data":"3a26ee09f037b61c523a3ac9064dd553a756bd485516c4358988a379c9d164c9"} Oct 02 16:55:32 crc kubenswrapper[4882]: I1002 16:55:32.875547 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4942/must-gather-8j2d9" event={"ID":"d56c4a94-c50c-452b-ae39-5bf98ad2cce7","Type":"ContainerStarted","Data":"7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94"} Oct 02 16:55:32 crc kubenswrapper[4882]: I1002 16:55:32.876089 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4942/must-gather-8j2d9" event={"ID":"d56c4a94-c50c-452b-ae39-5bf98ad2cce7","Type":"ContainerStarted","Data":"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396"} Oct 02 16:55:32 crc kubenswrapper[4882]: I1002 16:55:32.897331 4882 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k4942/must-gather-8j2d9" podStartSLOduration=2.196948168 podStartE2EDuration="5.897316649s" podCreationTimestamp="2025-10-02 16:55:27 +0000 UTC" firstStartedPulling="2025-10-02 16:55:28.531393802 +0000 UTC m=+2287.280623329" lastFinishedPulling="2025-10-02 16:55:32.231762283 +0000 UTC m=+2290.980991810" observedRunningTime="2025-10-02 16:55:32.895929317 +0000 UTC m=+2291.645158844" watchObservedRunningTime="2025-10-02 16:55:32.897316649 +0000 UTC m=+2291.646546176" Oct 02 16:55:34 crc kubenswrapper[4882]: I1002 16:55:34.760068 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:55:34 crc kubenswrapper[4882]: E1002 16:55:34.761312 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:55:48 crc kubenswrapper[4882]: I1002 16:55:48.760874 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:55:48 crc kubenswrapper[4882]: E1002 16:55:48.761666 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:03 crc kubenswrapper[4882]: I1002 16:56:03.759854 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:56:03 crc kubenswrapper[4882]: E1002 16:56:03.760623 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:17 crc kubenswrapper[4882]: I1002 16:56:17.760254 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:56:17 crc kubenswrapper[4882]: E1002 16:56:17.761001 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.459662 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/util/0.log" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.662805 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/util/0.log" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.680401 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/pull/0.log" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.695914 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/pull/0.log" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.827889 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/pull/0.log" Oct 02 16:56:24 
crc kubenswrapper[4882]: I1002 16:56:24.840769 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/util/0.log" Oct 02 16:56:24 crc kubenswrapper[4882]: I1002 16:56:24.882080 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4f452196a73d56ffcf54d3cad60ba044b2d6d9247fcec585eed680a096r7zvb_9d37872a-965e-4d6c-8f57-688b99de18bb/extract/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.014274 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-mx2hz_d32146d6-c9aa-4864-bab5-71beebc6c6ef/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.081656 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-mx2hz_d32146d6-c9aa-4864-bab5-71beebc6c6ef/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.097011 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-nwf78_cd42cabb-2afc-45b4-aecd-9d97980e0840/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.232128 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-nwf78_cd42cabb-2afc-45b4-aecd-9d97980e0840/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.268425 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-j4mbs_aece5dae-37d5-44cc-b83c-611712686fbb/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.310628 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-j4mbs_aece5dae-37d5-44cc-b83c-611712686fbb/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.407403 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-2sngh_cdf5d013-2e3a-41f5-ad36-c870b219a572/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.488818 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-2sngh_cdf5d013-2e3a-41f5-ad36-c870b219a572/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.589686 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-4z9l6_56d9b46f-821f-44dd-8139-4b3d1c2b1149/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.604017 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-4z9l6_56d9b46f-821f-44dd-8139-4b3d1c2b1149/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.715400 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-t5nnp_aee0bda3-847c-41b1-bb44-b2bd8875c9a1/kube-rbac-proxy/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.777050 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-t5nnp_aee0bda3-847c-41b1-bb44-b2bd8875c9a1/manager/0.log" Oct 02 16:56:25 crc kubenswrapper[4882]: I1002 16:56:25.861271 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-lgpvs_8b6c43c1-61b7-42ea-b66e-a19e54e67b50/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.017788 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-lgpvs_8b6c43c1-61b7-42ea-b66e-a19e54e67b50/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.039319 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-krzwg_2c520177-6db1-41dd-808a-49f676f9870c/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.072932 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-krzwg_2c520177-6db1-41dd-808a-49f676f9870c/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.170665 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-bhztc_131678a2-c80e-4990-85ef-9f8ed80cb4da/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.316861 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-bhztc_131678a2-c80e-4990-85ef-9f8ed80cb4da/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.373766 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-qb4lj_08f8538a-699b-4ef0-9e48-da37040acdeb/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.389817 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-qb4lj_08f8538a-699b-4ef0-9e48-da37040acdeb/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.503304 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-ddwmt_75e8c312-5fcf-4106-953f-85077c2485aa/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.584338 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-ddwmt_75e8c312-5fcf-4106-953f-85077c2485aa/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.651368 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-hhcmt_8b1d8fdd-7f24-41c3-a604-7bd8df0972c4/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.710173 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-hhcmt_8b1d8fdd-7f24-41c3-a604-7bd8df0972c4/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.760150 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-x6jfr_63f23924-224e-40d8-901a-bbb56f30163d/kube-rbac-proxy/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.934587 4882 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-x6jfr_63f23924-224e-40d8-901a-bbb56f30163d/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.975716 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-dfckg_a50abab9-485d-40a2-80d7-5d3134a14908/manager/0.log" Oct 02 16:56:26 crc kubenswrapper[4882]: I1002 16:56:26.980180 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-dfckg_a50abab9-485d-40a2-80d7-5d3134a14908/kube-rbac-proxy/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.154718 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7774cdf765ssbw4_7a766286-4d8e-45ce-b444-3dcf4cd9bf57/kube-rbac-proxy/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.161286 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7774cdf765ssbw4_7a766286-4d8e-45ce-b444-3dcf4cd9bf57/manager/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.311757 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c9cff6d55-xtjhj_96795030-dccf-45ce-9aa5-6104145b247d/kube-rbac-proxy/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.391032 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-bb8dc5db7-k2zgd_bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c/kube-rbac-proxy/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.593662 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rlckz_a5f7a30e-60f0-4b08-9c14-b2643f8a0bf3/registry-server/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.670145 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-bb8dc5db7-k2zgd_bb8a3d3b-4bcf-4067-a6b3-bbba3cc2994c/operator/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.781903 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-tqmjh_e05cf5dc-1295-4227-9527-f22cac2d26a0/kube-rbac-proxy/0.log" Oct 02 16:56:27 crc kubenswrapper[4882]: I1002 16:56:27.964147 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-hvnhq_968e86ab-e082-4b95-a520-67672bc1d662/kube-rbac-proxy/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.064814 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-tqmjh_e05cf5dc-1295-4227-9527-f22cac2d26a0/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.128411 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-hvnhq_968e86ab-e082-4b95-a520-67672bc1d662/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.141378 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c9cff6d55-xtjhj_96795030-dccf-45ce-9aa5-6104145b247d/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.169838 
4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-27tjx_6a6df481-1b16-40a6-b98a-efd2172c77d2/operator/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.316272 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-25zdk_8f5b9997-c12c-4ba9-9efe-649f9d03e52e/kube-rbac-proxy/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.339094 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-25zdk_8f5b9997-c12c-4ba9-9efe-649f9d03e52e/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.390600 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-qmnwx_a430ae1a-9023-4c9d-a4ee-ed474e3bbd88/kube-rbac-proxy/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.509157 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-qmnwx_a430ae1a-9023-4c9d-a4ee-ed474e3bbd88/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.544152 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-twxqz_7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7/kube-rbac-proxy/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.548841 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-twxqz_7f6d7bc1-3858-4c18-9c1a-d2ee4545c9c7/manager/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.702886 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-5sm69_922b0b10-8ea8-4b15-b170-561690a29ff1/kube-rbac-proxy/0.log" Oct 02 16:56:28 crc kubenswrapper[4882]: I1002 16:56:28.703548 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-5sm69_922b0b10-8ea8-4b15-b170-561690a29ff1/manager/0.log" Oct 02 16:56:30 crc kubenswrapper[4882]: I1002 16:56:30.760057 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:56:30 crc kubenswrapper[4882]: E1002 16:56:30.760404 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:43 crc kubenswrapper[4882]: I1002 16:56:43.518823 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7cgt8_5ec03001-e8a9-4e0f-a20a-ad45cb0542c6/control-plane-machine-set-operator/0.log" Oct 02 16:56:43 crc kubenswrapper[4882]: I1002 16:56:43.720339 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9snj8_ed430b79-ad1a-456d-be2f-6cb51f2564dc/kube-rbac-proxy/0.log" Oct 02 16:56:43 crc kubenswrapper[4882]: I1002 16:56:43.759865 4882 scope.go:117] "RemoveContainer" 
containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:56:43 crc kubenswrapper[4882]: E1002 16:56:43.760111 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:43 crc kubenswrapper[4882]: I1002 16:56:43.774492 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9snj8_ed430b79-ad1a-456d-be2f-6cb51f2564dc/machine-api-operator/0.log" Oct 02 16:56:54 crc kubenswrapper[4882]: I1002 16:56:54.760998 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:56:54 crc kubenswrapper[4882]: E1002 16:56:54.762594 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:56:54 crc kubenswrapper[4882]: I1002 16:56:54.922600 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-cws7m_d90adcf1-b8ee-4f9b-a722-430f2e782f1d/cert-manager-controller/0.log" Oct 02 16:56:55 crc kubenswrapper[4882]: I1002 16:56:55.095226 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-nntcr_1289b43b-8f30-4ec0-bfe8-e42890a1ebfb/cert-manager-webhook/0.log" Oct 02 16:56:55 crc kubenswrapper[4882]: I1002 16:56:55.099456 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-lb8mv_677982a9-5a39-436c-abca-e7a8d0ac0a6f/cert-manager-cainjector/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.146493 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-p26rf_a304df3f-c207-4b0e-93e4-62f68ae75159/nmstate-console-plugin/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.289986 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9gswz_0239ce1d-4e1a-4ffa-b9b3-654250a881b8/nmstate-handler/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.339556 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-mf42j_fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e/nmstate-metrics/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.368022 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-mf42j_fe54acc0-8f5b-4a7a-8fd9-c8dec2c4ca3e/kube-rbac-proxy/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.489863 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-llkzt_c0a1d391-cb92-485c-b8a1-55f6383284af/nmstate-operator/0.log" Oct 02 16:57:06 crc kubenswrapper[4882]: I1002 16:57:06.548740 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-skbdp_aede390b-5a93-41e4-b404-29edaf0d16c9/nmstate-webhook/0.log" Oct 02 16:57:07 crc kubenswrapper[4882]: I1002 16:57:07.760088 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:57:07 crc kubenswrapper[4882]: E1002 16:57:07.760538 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:57:19 crc kubenswrapper[4882]: I1002 16:57:19.274493 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9dk8h_3495a5a2-8e62-47ee-9f17-478981d8fa27/kube-rbac-proxy/0.log" Oct 02 16:57:19 crc kubenswrapper[4882]: I1002 16:57:19.649094 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-9dk8h_3495a5a2-8e62-47ee-9f17-478981d8fa27/controller/0.log" Oct 02 16:57:19 crc kubenswrapper[4882]: I1002 16:57:19.685066 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-z72g6_75484683-14ac-4898-becf-2faf5970437d/frr-k8s-webhook-server/0.log" Oct 02 16:57:19 crc kubenswrapper[4882]: I1002 16:57:19.758737 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-frr-files/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.013380 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-reloader/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.034589 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-frr-files/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.035919 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-reloader/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.057538 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-metrics/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.204248 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-frr-files/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.229033 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-reloader/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.230267 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-metrics/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.270353 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-metrics/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.465701 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-reloader/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.480565 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-frr-files/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.509694 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/cp-metrics/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.528390 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/controller/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.719871 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/kube-rbac-proxy/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.736108 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/frr-metrics/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.795652 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/kube-rbac-proxy-frr/0.log" Oct 02 16:57:20 crc kubenswrapper[4882]: I1002 16:57:20.897462 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/reloader/0.log" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.047983 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84d6bb9698-dxdmn_8efade14-68e0-4743-a2f7-e7b9eac6ceb2/manager/0.log" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.165391 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d58b8bdc4-n4klj_eac510c5-342f-431f-8b27-c6919e7703aa/webhook-server/0.log" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.610568 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xrh84_bb683c00-49a1-49b9-8ab5-5d519f4ad310/kube-rbac-proxy/0.log" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.735846 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zfhmf_aa396476-fa37-4443-af9a-1cf39f701d65/frr/0.log" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.760087 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:57:21 crc kubenswrapper[4882]: E1002 16:57:21.760358 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:57:21 crc kubenswrapper[4882]: I1002 16:57:21.813806 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xrh84_bb683c00-49a1-49b9-8ab5-5d519f4ad310/speaker/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.431090 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/util/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.690311 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/util/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.793519 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/pull/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.820280 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/pull/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.959435 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/util/0.log" Oct 02 16:57:33 crc kubenswrapper[4882]: I1002 16:57:33.965451 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/extract/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.027222 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l8vpm_3dbaddb4-a180-45f5-af37-81327245a4dd/pull/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.151242 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/util/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.295866 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/pull/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.296018 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/pull/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.312896 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/util/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.477458 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/pull/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.506501 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/util/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.507510 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2w8h82_2968d005-f404-4387-ac4c-739ba42a465e/extract/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.638013 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-utilities/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.760747 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:57:34 crc kubenswrapper[4882]: E1002 16:57:34.761258 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.805898 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-content/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.859656 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-content/0.log" Oct 02 16:57:34 crc kubenswrapper[4882]: I1002 16:57:34.869441 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-utilities/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.021308 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-utilities/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.038398 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/extract-content/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.235331 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-utilities/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.382605 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2qnvl_efdb4f07-c013-4677-86ec-bdda64483def/registry-server/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.455826 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-utilities/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.464978 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-content/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.471653 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-content/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.644152 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-utilities/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.733409 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/extract-content/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.864253 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/util/0.log" Oct 02 16:57:35 crc kubenswrapper[4882]: I1002 16:57:35.984919 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5wm7q_b276f479-6426-4f8f-acd1-9823370944e2/registry-server/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.098747 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/util/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.099694 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/pull/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.125179 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/pull/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.263911 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/util/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.318877 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/extract/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.322065 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c72488_bf113633-1b94-4991-89f2-7418c97f620f/pull/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.441073 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xfg74_9ce3e7a3-a9c7-420b-a4df-d0445f4b3651/marketplace-operator/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.513533 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-utilities/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.768477 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-utilities/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.818122 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-content/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.838764 4882 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-content/0.log" Oct 02 16:57:36 crc kubenswrapper[4882]: I1002 16:57:36.988490 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-utilities/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.053664 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/extract-content/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.098828 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-utilities/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.131897 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hfrrl_6d9dce59-8ae4-4ed2-937a-0ddc0bf1cecb/registry-server/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.239770 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-utilities/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.261443 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-content/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.289076 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-content/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.478680 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-utilities/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.478986 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/extract-content/0.log" Oct 02 16:57:37 crc kubenswrapper[4882]: I1002 16:57:37.709246 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zlgt4_4a42aadc-015c-40ed-9624-f6417e4e68fc/registry-server/0.log" Oct 02 16:57:47 crc kubenswrapper[4882]: I1002 16:57:47.760355 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:57:47 crc kubenswrapper[4882]: E1002 16:57:47.761143 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:58:01 crc kubenswrapper[4882]: I1002 16:58:01.760248 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:58:01 crc kubenswrapper[4882]: E1002 16:58:01.761292 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:58:13 crc kubenswrapper[4882]: I1002 16:58:13.761129 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:58:13 crc kubenswrapper[4882]: E1002 16:58:13.762129 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:58:26 crc kubenswrapper[4882]: I1002 16:58:26.760538 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:58:26 crc kubenswrapper[4882]: E1002 16:58:26.761703 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:58:39 crc kubenswrapper[4882]: I1002 16:58:39.267881 4882 generic.go:334] "Generic (PLEG): container finished" podID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerID="17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396" exitCode=0 Oct 02 16:58:39 crc kubenswrapper[4882]: I1002 16:58:39.267997 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k4942/must-gather-8j2d9" event={"ID":"d56c4a94-c50c-452b-ae39-5bf98ad2cce7","Type":"ContainerDied","Data":"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396"} Oct 02 16:58:39 crc kubenswrapper[4882]: I1002 16:58:39.268941 4882 scope.go:117] "RemoveContainer" containerID="17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396" Oct 02 16:58:39 crc kubenswrapper[4882]: I1002 16:58:39.760359 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:58:39 crc kubenswrapper[4882]: E1002 16:58:39.760693 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:58:39 crc kubenswrapper[4882]: I1002 16:58:39.872068 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4942_must-gather-8j2d9_d56c4a94-c50c-452b-ae39-5bf98ad2cce7/gather/0.log" Oct 02 16:58:46 crc kubenswrapper[4882]: I1002 16:58:46.743457 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k4942/must-gather-8j2d9"] Oct 02 16:58:46 crc kubenswrapper[4882]: I1002 16:58:46.744121 4882 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-k4942/must-gather-8j2d9" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="copy" containerID="cri-o://7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94" gracePeriod=2 Oct 02 16:58:46 crc kubenswrapper[4882]: I1002 16:58:46.751273 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k4942/must-gather-8j2d9"] Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.097038 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4942_must-gather-8j2d9_d56c4a94-c50c-452b-ae39-5bf98ad2cce7/copy/0.log" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.097790 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.229607 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output\") pod \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.229685 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7qvd\" (UniqueName: \"kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd\") pod \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\" (UID: \"d56c4a94-c50c-452b-ae39-5bf98ad2cce7\") " Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.235329 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd" (OuterVolumeSpecName: "kube-api-access-d7qvd") pod "d56c4a94-c50c-452b-ae39-5bf98ad2cce7" (UID: "d56c4a94-c50c-452b-ae39-5bf98ad2cce7"). InnerVolumeSpecName "kube-api-access-d7qvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.299420 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d56c4a94-c50c-452b-ae39-5bf98ad2cce7" (UID: "d56c4a94-c50c-452b-ae39-5bf98ad2cce7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.328059 4882 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k4942_must-gather-8j2d9_d56c4a94-c50c-452b-ae39-5bf98ad2cce7/copy/0.log" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.328688 4882 generic.go:334] "Generic (PLEG): container finished" podID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerID="7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94" exitCode=143 Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.328744 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k4942/must-gather-8j2d9" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.328762 4882 scope.go:117] "RemoveContainer" containerID="7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.332039 4882 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.332073 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7qvd\" (UniqueName: \"kubernetes.io/projected/d56c4a94-c50c-452b-ae39-5bf98ad2cce7-kube-api-access-d7qvd\") on node \"crc\" DevicePath \"\"" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.345640 4882 scope.go:117] "RemoveContainer" containerID="17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.413178 4882 scope.go:117] "RemoveContainer" containerID="7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94" Oct 02 16:58:47 crc kubenswrapper[4882]: E1002 16:58:47.413748 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94\": container with ID starting with 7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94 not found: ID does not exist" containerID="7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.413819 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94"} err="failed to get container status \"7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94\": rpc error: code = NotFound desc = could not find container \"7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94\": container with ID starting with 7f35fc6b9a67d745824cdeb2f62231716c27ff05f3c5f8009d5bb3a8b3fa0c94 not found: ID does not exist" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.413852 4882 scope.go:117] "RemoveContainer" containerID="17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396" Oct 02 16:58:47 crc kubenswrapper[4882]: E1002 16:58:47.414168 4882 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396\": container with ID starting with 17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396 not found: ID does not exist" containerID="17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396" Oct 02 16:58:47 crc kubenswrapper[4882]: I1002 16:58:47.414234 4882 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396"} err="failed to get container status \"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396\": rpc error: code = NotFound desc = could not find container \"17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396\": container with ID starting with 17e241112a6316b57c8c4fea02272141fc0597736e6394c93aba467e67c4d396 not found: ID does not exist" Oct 02 16:58:48 crc kubenswrapper[4882]: I1002 16:58:48.770841 4882 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" path="/var/lib/kubelet/pods/d56c4a94-c50c-452b-ae39-5bf98ad2cce7/volumes" Oct 02 16:58:52 crc kubenswrapper[4882]: I1002 16:58:52.763899 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:58:52 crc kubenswrapper[4882]: E1002 16:58:52.764493 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:59:07 crc kubenswrapper[4882]: I1002 16:59:07.760573 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:59:07 crc kubenswrapper[4882]: E1002 16:59:07.761548 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:59:18 crc kubenswrapper[4882]: I1002 16:59:18.760547 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:59:18 crc kubenswrapper[4882]: E1002 16:59:18.761269 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:59:31 crc kubenswrapper[4882]: I1002 16:59:31.760803 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:59:31 crc kubenswrapper[4882]: E1002 16:59:31.761573 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 16:59:43 crc kubenswrapper[4882]: I1002 16:59:43.760925 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:59:43 crc kubenswrapper[4882]: E1002 16:59:43.761630 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 
16:59:56 crc kubenswrapper[4882]: I1002 16:59:56.760015 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 16:59:56 crc kubenswrapper[4882]: E1002 16:59:56.760801 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.147517 4882 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t"] Oct 02 17:00:00 crc kubenswrapper[4882]: E1002 17:00:00.148176 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="copy" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.148194 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="copy" Oct 02 17:00:00 crc kubenswrapper[4882]: E1002 17:00:00.148251 4882 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="gather" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.148261 4882 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="gather" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.148418 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="copy" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.148443 4882 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56c4a94-c50c-452b-ae39-5bf98ad2cce7" containerName="gather" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.148974 4882 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.151201 4882 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.152757 4882 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.157787 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.157975 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.158116 4882 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7vg\" (UniqueName: \"kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.171283 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t"] Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.259345 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.259689 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.259818 4882 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7vg\" (UniqueName: \"kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.260315 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume\") pod 
\"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.266007 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.276910 4882 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7vg\" (UniqueName: \"kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg\") pod \"collect-profiles-29323740-j7g5t\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.476966 4882 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.881404 4882 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t"] Oct 02 17:00:00 crc kubenswrapper[4882]: I1002 17:00:00.904209 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" event={"ID":"9c8dc311-eb79-4091-bab1-78e7771813c6","Type":"ContainerStarted","Data":"856bf23df69cc000b4ff1ba4ca9d02b9af4456e065483a56a3cd0a7bfdbf6901"} Oct 02 17:00:01 crc kubenswrapper[4882]: I1002 17:00:01.915472 4882 generic.go:334] "Generic (PLEG): container finished" podID="9c8dc311-eb79-4091-bab1-78e7771813c6" containerID="2d4781811ec9fc43eed538ffb6600577a9ddd3cb7a45f1a9e6118e7248ba6cfc" exitCode=0 Oct 02 17:00:01 crc kubenswrapper[4882]: I1002 17:00:01.915544 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" event={"ID":"9c8dc311-eb79-4091-bab1-78e7771813c6","Type":"ContainerDied","Data":"2d4781811ec9fc43eed538ffb6600577a9ddd3cb7a45f1a9e6118e7248ba6cfc"} Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.200499 4882 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.202182 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume\") pod \"9c8dc311-eb79-4091-bab1-78e7771813c6\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.202243 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume\") pod \"9c8dc311-eb79-4091-bab1-78e7771813c6\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.202272 4882 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh7vg\" (UniqueName: \"kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg\") pod \"9c8dc311-eb79-4091-bab1-78e7771813c6\" (UID: \"9c8dc311-eb79-4091-bab1-78e7771813c6\") " Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.204305 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c8dc311-eb79-4091-bab1-78e7771813c6" (UID: "9c8dc311-eb79-4091-bab1-78e7771813c6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.210516 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg" (OuterVolumeSpecName: "kube-api-access-wh7vg") pod "9c8dc311-eb79-4091-bab1-78e7771813c6" (UID: "9c8dc311-eb79-4091-bab1-78e7771813c6"). InnerVolumeSpecName "kube-api-access-wh7vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.210513 4882 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9c8dc311-eb79-4091-bab1-78e7771813c6" (UID: "9c8dc311-eb79-4091-bab1-78e7771813c6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.303332 4882 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c8dc311-eb79-4091-bab1-78e7771813c6-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.303362 4882 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c8dc311-eb79-4091-bab1-78e7771813c6-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.303372 4882 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh7vg\" (UniqueName: \"kubernetes.io/projected/9c8dc311-eb79-4091-bab1-78e7771813c6-kube-api-access-wh7vg\") on node \"crc\" DevicePath \"\"" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.936627 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" event={"ID":"9c8dc311-eb79-4091-bab1-78e7771813c6","Type":"ContainerDied","Data":"856bf23df69cc000b4ff1ba4ca9d02b9af4456e065483a56a3cd0a7bfdbf6901"} Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.936873 4882 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856bf23df69cc000b4ff1ba4ca9d02b9af4456e065483a56a3cd0a7bfdbf6901" Oct 02 17:00:03 crc kubenswrapper[4882]: I1002 17:00:03.936695 4882 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323740-j7g5t" Oct 02 17:00:04 crc kubenswrapper[4882]: I1002 17:00:04.318714 4882 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"] Oct 02 17:00:04 crc kubenswrapper[4882]: I1002 17:00:04.324381 4882 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323695-6xdzv"] Oct 02 17:00:04 crc kubenswrapper[4882]: I1002 17:00:04.772190 4882 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f" path="/var/lib/kubelet/pods/e7c8d6a7-c7f0-4031-b9ab-70f4bc41b00f/volumes" Oct 02 17:00:07 crc kubenswrapper[4882]: I1002 17:00:07.759917 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 17:00:07 crc kubenswrapper[4882]: E1002 17:00:07.760499 4882 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jxblv_openshift-machine-config-operator(3bd74899-256b-4b2c-bcd7-51fb1d08991b)\"" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" podUID="3bd74899-256b-4b2c-bcd7-51fb1d08991b" Oct 02 17:00:18 crc kubenswrapper[4882]: I1002 17:00:18.760625 4882 scope.go:117] "RemoveContainer" containerID="39305be1724e62b949907bc1f65942485f7ef817af5664f5477b4e95ade203c3" Oct 02 17:00:19 crc kubenswrapper[4882]: I1002 17:00:19.068942 4882 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jxblv" event={"ID":"3bd74899-256b-4b2c-bcd7-51fb1d08991b","Type":"ContainerStarted","Data":"2c9afa10ced8b6a577633ef16dc10aa66820601a4e6cd62fb9dfb487fc02d2b5"} Oct 02 17:00:29 crc kubenswrapper[4882]: I1002 17:00:29.056386 4882 scope.go:117] 
"RemoveContainer" containerID="a45588ca405ead2bc095135f0f6b0f3444228a97b15a986680662512149858e6"